Changeset 107445 in webkit for trunk/Source/JavaScriptCore/heap


Timestamp: Feb 10, 2012, 2:44:09 PM
Author: [email protected]
Message:

Split MarkedSpace into destructor and destructor-free subspaces
https://bugs.webkit.org/show_bug.cgi?id=77761

Reviewed by Geoffrey Garen.

Source/JavaScriptCore:

  • dfg/DFGSpeculativeJIT.h:

(JSC::DFG::SpeculativeJIT::emitAllocateJSFinalObject): Switched over to use destructor-free space.

  • heap/Heap.h:

(JSC::Heap::allocatorForObjectWithoutDestructor): Added to give clients (e.g. the JIT) the ability to
pick which subspace they want to allocate out of.
(JSC::Heap::allocatorForObjectWithDestructor): Ditto.
(Heap):
(JSC::Heap::allocateWithDestructor): Added private function for CellAllocator to use.
(JSC):
(JSC::Heap::allocateWithoutDestructor): Ditto.

  • heap/MarkedAllocator.cpp: Added the cellsNeedDestruction flag to allocators so that they can allocate
their MarkedBlocks correctly.
(JSC::MarkedAllocator::allocateBlock):

  • heap/MarkedAllocator.h:

(JSC::MarkedAllocator::cellsNeedDestruction):
(MarkedAllocator):
(JSC::MarkedAllocator::MarkedAllocator):
(JSC):
(JSC::MarkedAllocator::init): Replaced the custom setter functions, which were only used during
initialization, with a single init function that does the same setup in fewer lines.

  • heap/MarkedBlock.cpp:

(JSC::MarkedBlock::create):
(JSC::MarkedBlock::recycle):
(JSC::MarkedBlock::MarkedBlock):
(JSC::MarkedBlock::callDestructor): Templatized, along with specializedSweep and sweepHelper, to make
checking the m_cellsNeedDestruction flag faster and cleaner.
(JSC):
(JSC::MarkedBlock::specializedSweep):
(JSC::MarkedBlock::sweep):
(JSC::MarkedBlock::sweepHelper):

  • heap/MarkedBlock.h:

(MarkedBlock):
(JSC::MarkedBlock::cellsNeedDestruction):
(JSC):

  • heap/MarkedSpace.cpp:

(JSC::MarkedSpace::MarkedSpace):
(JSC::MarkedSpace::resetAllocators):
(JSC::MarkedSpace::canonicalizeCellLivenessData):
(JSC::TakeIfUnmarked::operator()):

  • heap/MarkedSpace.h:

(MarkedSpace):
(Subspace):
(JSC::MarkedSpace::allocatorFor): Needed function to differentiate between the two broad subspaces of
allocators.
(JSC):
(JSC::MarkedSpace::destructorAllocatorFor): Ditto.
(JSC::MarkedSpace::allocateWithoutDestructor): Ditto.
(JSC::MarkedSpace::allocateWithDestructor): Ditto.
(JSC::MarkedSpace::forEachBlock):

  • jit/JIT.h:
  • jit/JITInlineMethods.h: Modified to use the proper allocator for JSFinalObjects and others (a rough sketch of the allocator selection follows below).

(JSC::JIT::emitAllocateBasicJSObject):
(JSC::JIT::emitAllocateJSFinalObject):
(JSC::JIT::emitAllocateJSFunction):
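
As a rough illustration of what the allocator selection amounts to (a sketch under assumptions: the
boolean template parameter and the exact parameter list are mine, and the real emitter goes on to emit
the inline free-list fast path rather than returning), the JIT now picks a MarkedAllocator per class at
JIT-compile time:

    // Sketch only: choose the subspace for ClassType before emitting the
    // inline allocation fast path against that allocator's free list.
    template <typename ClassType, bool destructor, typename StructureType>
    inline void JIT::emitAllocateBasicJSObject(StructureType structure, RegisterID result, RegisterID storagePtr)
    {
        MarkedAllocator* allocator = 0;
        if (destructor)
            allocator = &m_globalData->heap.allocatorForObjectWithDestructor(sizeof(ClassType));
        else
            allocator = &m_globalData->heap.allocatorForObjectWithoutDestructor(sizeof(ClassType));
        // ... emit loads/stores that bump-allocate out of 'allocator' ...
    }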

  • runtime/JSArray.cpp:

(JSC):

  • runtime/JSArray.h:

(JSArray):
(JSC::JSArray::create):
(JSC):
(JSC::JSArray::tryCreateUninitialized):

  • runtime/JSCell.h:

(JSCell):
(JSC):
(NeedsDestructor): Template struct that determines at compile time whether the class in question requires
destruction, using the compiler type trait has_trivial_destructor. allocateCell then checks this
constant to decide whether to allocate in the destructor or the destructor-free part of the heap, as
sketched below.
(JSC::allocateCell):
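
Since runtime/JSCell.h is outside the heap directory, its diff is not included below. A minimal sketch
of the idea (the exact trait spelling and function body here are assumptions based on the description
above, not a copy of the real code):

    // Sketch: route cell classes with trivial destructors into the
    // destructor-free subspace so sweeping never has to call destructors there.
    template<typename T> struct NeedsDestructor {
        static const bool value = !WTF::HasTrivialDestructor<T>::value;
    };

    template<typename T>
    void* allocateCell(Heap& heap)
    {
        if (NeedsDestructor<T>::value)
            return heap.allocateWithDestructor(sizeof(T));
        return heap.allocateWithoutDestructor(sizeof(T));
    }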

  • runtime/JSFunction.cpp:

(JSC):

  • runtime/JSFunction.h:

(JSFunction):

  • runtime/JSObject.cpp:

(JSC):

  • runtime/JSObject.h:

(JSNonFinalObject):
(JSC):
(JSFinalObject):
(JSC::JSFinalObject::create):

Source/WebCore:

No new tests.

  • bindings/js/JSDOMWindowShell.cpp: Removed the old operator new, which was only used in the create
function, so that we can use allocateCell instead.
(WebCore):

  • bindings/js/JSDOMWindowShell.h:

(WebCore::JSDOMWindowShell::create):
(JSDOMWindowShell):

  • bindings/scripts/CodeGeneratorJS.pm: Added destructors back to the root JS DOM nodes (e.g. JSNode)
because their destroy functions need to be called, so we don't want the NeedsDestructor struct to
think they don't need destruction due to having empty/trivial destructors.
Removed ASSERT_HAS_TRIVIAL_DESTRUCTOR from all auto-generated JS DOM wrapper objects because their
ancestors now have non-trivial destructors (a sketch of the distinction follows below).
(GenerateHeader):
(GenerateImplementation):
(GenerateConstructorDefinition):
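
To make the trivial-destructor distinction concrete, a hedged sketch of what the generated headers now
encode (class names are illustrative; this is not the generator's literal output):

    // Root wrapper: its static destroy() must run, so it declares a destructor.
    // That makes the destructor non-trivial, NeedsDestructor<JSNode>::value is
    // true, and instances are allocated in the destructor subspace.
    class JSNode : public JSDOMWrapper {
    public:
        static void destroy(JSC::JSCell*);
        ~JSNode();
    };

    // Derived wrappers inherit the non-trivial destructor, which is why
    // ASSERT_HAS_TRIVIAL_DESTRUCTOR had to be removed from their generated code.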

Location: trunk/Source/JavaScriptCore/heap
Files: 7 edited

  • trunk/Source/JavaScriptCore/heap/Heap.h

    r106676 → r107445

@@
         inline bool isBusy();

-        MarkedAllocator& allocatorForObject(size_t bytes) { return m_objectSpace.allocatorFor(bytes); }
-        void* allocate(size_t);
+        MarkedAllocator& allocatorForObjectWithoutDestructor(size_t bytes) { return m_objectSpace.allocatorFor(bytes); }
+        MarkedAllocator& allocatorForObjectWithDestructor(size_t bytes) { return m_objectSpace.destructorAllocatorFor(bytes); }
         CheckedBoolean tryAllocateStorage(size_t, void**);
         CheckedBoolean tryReallocateStorage(void**, size_t, size_t);
@@
         friend class SlotVisitor;
         friend class CodeBlock;
+        template<typename T> friend void* allocateCell(Heap&);
+
+        void* allocateWithDestructor(size_t);
+        void* allocateWithoutDestructor(size_t);

         size_t waterMark();
@@
     }

-    inline void* Heap::allocate(size_t bytes)
+    inline void* Heap::allocateWithDestructor(size_t bytes)
     {
         ASSERT(isValidAllocation(bytes));
-        return m_objectSpace.allocate(bytes);
+        return m_objectSpace.allocateWithDestructor(bytes);
+    }
+
+    inline void* Heap::allocateWithoutDestructor(size_t bytes)
+    {
+        ASSERT(isValidAllocation(bytes));
+        return m_objectSpace.allocateWithoutDestructor(bytes);
     }

  • trunk/Source/JavaScriptCore/heap/MarkedAllocator.cpp

    r106677 → r107445

@@
     }
     if (block)
-        block = MarkedBlock::recycle(block, m_heap, m_cellSize);
+        block = MarkedBlock::recycle(block, m_heap, m_cellSize, m_cellsNeedDestruction);
     else if (allocationEffort == AllocationCanFail)
         return 0;
     else
-        block = MarkedBlock::create(m_heap, m_cellSize);
+        block = MarkedBlock::create(m_heap, m_cellSize, m_cellsNeedDestruction);

     m_markedSpace->didAddBlock(block);
  • trunk/Source/JavaScriptCore/heap/MarkedAllocator.h

    r106677 → r107445

@@
     void zapFreeList();
     size_t cellSize() { return m_cellSize; }
+    bool cellsNeedDestruction() { return m_cellsNeedDestruction; }
     void* allocate();
     Heap* heap() { return m_heap; }
@@
     void addBlock(MarkedBlock*);
     void removeBlock(MarkedBlock*);
-    void setHeap(Heap* heap) { m_heap = heap; }
-    void setCellSize(size_t cellSize) { m_cellSize = cellSize; }
-    void setMarkedSpace(MarkedSpace* space) { m_markedSpace = space; }
+    void init(Heap*, MarkedSpace*, size_t cellSize, bool cellsNeedDestruction);

 private:
@@
     DoublyLinkedList<HeapBlock> m_blockList;
     size_t m_cellSize;
+    bool m_cellsNeedDestruction;
     Heap* m_heap;
     MarkedSpace* m_markedSpace;
@@
     , m_currentBlock(0)
     , m_cellSize(0)
+    , m_cellsNeedDestruction(true)
     , m_heap(0)
     , m_markedSpace(0)
 {
 }
-
+
+inline void MarkedAllocator::init(Heap* heap, MarkedSpace* markedSpace, size_t cellSize, bool cellsNeedDestruction)
+{
+    m_heap = heap;
+    m_markedSpace = markedSpace;
+    m_cellSize = cellSize;
+    m_cellsNeedDestruction = cellsNeedDestruction;
+}
+
 inline void* MarkedAllocator::allocate()
 {
  • trunk/Source/JavaScriptCore/heap/MarkedBlock.cpp

    r106686 → r107445

@@
 namespace JSC {

-MarkedBlock* MarkedBlock::create(Heap* heap, size_t cellSize)
+MarkedBlock* MarkedBlock::create(Heap* heap, size_t cellSize, bool cellsNeedDestruction)
 {
     PageAllocationAligned allocation = PageAllocationAligned::allocate(blockSize, blockSize, OSAllocator::JSGCHeapPages);
     if (!static_cast<bool>(allocation))
         CRASH();
-    return new (NotNull, allocation.base()) MarkedBlock(allocation, heap, cellSize);
-}
-
-MarkedBlock* MarkedBlock::recycle(MarkedBlock* block, Heap* heap, size_t cellSize)
-{
-    return new (NotNull, block) MarkedBlock(block->m_allocation, heap, cellSize);
+    return new (NotNull, allocation.base()) MarkedBlock(allocation, heap, cellSize, cellsNeedDestruction);
+}
+
+MarkedBlock* MarkedBlock::recycle(MarkedBlock* block, Heap* heap, size_t cellSize, bool cellsNeedDestruction)
+{
+    return new (NotNull, block) MarkedBlock(block->m_allocation, heap, cellSize, cellsNeedDestruction);
 }

@@
 }

-MarkedBlock::MarkedBlock(PageAllocationAligned& allocation, Heap* heap, size_t cellSize)
+MarkedBlock::MarkedBlock(PageAllocationAligned& allocation, Heap* heap, size_t cellSize, bool cellsNeedDestruction)
     : HeapBlock(allocation)
     , m_atomsPerCell((cellSize + atomSize - 1) / atomSize)
     , m_endAtom(atomsPerBlock - m_atomsPerCell + 1)
+    , m_cellsNeedDestruction(cellsNeedDestruction)
     , m_state(New) // All cells start out unmarked.
     , m_heap(heap)
@@
     m_heap->m_destroyedTypeCounts.countVPtr(vptr);
 #endif
-    if (cell->classInfo() != &JSFinalObject::s_info)
-        cell->methodTable()->destroy(cell);
+    ASSERT(cell->classInfo() != &JSFinalObject::s_info);
+    cell->methodTable()->destroy(cell);

     cell->zap();
 }

-template<MarkedBlock::BlockState blockState, MarkedBlock::SweepMode sweepMode>
+template<MarkedBlock::BlockState blockState, MarkedBlock::SweepMode sweepMode, bool destructorCallNeeded>
 MarkedBlock::FreeCell* MarkedBlock::specializedSweep()
 {
     ASSERT(blockState != Allocated && blockState != FreeListed);
+    ASSERT(destructorCallNeeded || sweepMode != SweepOnly);

     // This produces a free list that is ordered in reverse through the block.
@@
             continue;

-        if (blockState != New)
+        if (destructorCallNeeded && blockState != New)
             callDestructor(cell);

@@
     HEAP_LOG_BLOCK_STATE_TRANSITION(this);

+    if (sweepMode == SweepOnly && !m_cellsNeedDestruction)
+        return 0;
+
+    if (m_cellsNeedDestruction)
+        return sweepHelper<true>(sweepMode);
+    return sweepHelper<false>(sweepMode);
+}
+
+template<bool destructorCallNeeded>
+MarkedBlock::FreeCell* MarkedBlock::sweepHelper(SweepMode sweepMode)
+{
     switch (m_state) {
     case New:
         ASSERT(sweepMode == SweepToFreeList);
-        return specializedSweep<New, SweepToFreeList>();
+        return specializedSweep<New, SweepToFreeList, destructorCallNeeded>();
     case FreeListed:
         // Happens when a block transitions to fully allocated.
@@
     case Marked:
         return sweepMode == SweepToFreeList
-            ? specializedSweep<Marked, SweepToFreeList>()
-            : specializedSweep<Marked, SweepOnly>();
+            ? specializedSweep<Marked, SweepToFreeList, destructorCallNeeded>()
+            : specializedSweep<Marked, SweepOnly, destructorCallNeeded>();
     case Zapped:
         return sweepMode == SweepToFreeList
-            ? specializedSweep<Zapped, SweepToFreeList>()
-            : specializedSweep<Zapped, SweepOnly>();
+            ? specializedSweep<Zapped, SweepToFreeList, destructorCallNeeded>()
+            : specializedSweep<Zapped, SweepOnly, destructorCallNeeded>();
     }

  • trunk/Source/JavaScriptCore/heap/MarkedBlock.h

    r106686 → r107445

@@
         };

-        static MarkedBlock* create(Heap*, size_t cellSize);
-        static MarkedBlock* recycle(MarkedBlock*, Heap*, size_t cellSize);
+        static MarkedBlock* create(Heap*, size_t cellSize, bool cellsNeedDestruction);
+        static MarkedBlock* recycle(MarkedBlock*, Heap*, size_t cellSize, bool cellsNeedDestruction);
         static void destroy(MarkedBlock*);

@@

         size_t cellSize();
+        bool cellsNeedDestruction();

         size_t size();
@@

         enum BlockState { New, FreeListed, Allocated, Marked, Zapped };
+        template<bool destructorCallNeeded> FreeCell* sweepHelper(SweepMode = SweepOnly);

         typedef char Atom[atomSize];

-        MarkedBlock(PageAllocationAligned&, Heap*, size_t cellSize);
+        MarkedBlock(PageAllocationAligned&, Heap*, size_t cellSize, bool cellsNeedDestruction);
         Atom* atoms();
         size_t atomNumber(const void*);
         void callDestructor(JSCell*);
-        template<BlockState, SweepMode> FreeCell* specializedSweep();
+        template<BlockState, SweepMode, bool destructorCallNeeded> FreeCell* specializedSweep();

 #if ENABLE(GGC)
@@
         WTF::Bitmap<atomsPerBlock, WTF::BitmapNotAtomic> m_marks;
 #endif
+        bool m_cellsNeedDestruction;
         BlockState m_state;
         Heap* m_heap;
@@
     {
         return m_atomsPerCell * atomSize;
+    }
+
+    inline bool MarkedBlock::cellsNeedDestruction()
+    {
+        return m_cellsNeedDestruction;
     }

  • trunk/Source/JavaScriptCore/heap/MarkedSpace.cpp

    r106676 → r107445

@@
 {
     for (size_t cellSize = preciseStep; cellSize <= preciseCutoff; cellSize += preciseStep) {
-        allocatorFor(cellSize).setCellSize(cellSize);
-        allocatorFor(cellSize).setHeap(heap);
-        allocatorFor(cellSize).setMarkedSpace(this);
+        allocatorFor(cellSize).init(heap, this, cellSize, false);
+        destructorAllocatorFor(cellSize).init(heap, this, cellSize, true);
     }

     for (size_t cellSize = impreciseStep; cellSize <= impreciseCutoff; cellSize += impreciseStep) {
-        allocatorFor(cellSize).setCellSize(cellSize);
-        allocatorFor(cellSize).setHeap(heap);
-        allocatorFor(cellSize).setMarkedSpace(this);
+        allocatorFor(cellSize).init(heap, this, cellSize, false);
+        destructorAllocatorFor(cellSize).init(heap, this, cellSize, true);
     }
 }
@@
     m_nurseryWaterMark = 0;

-    for (size_t cellSize = preciseStep; cellSize <= preciseCutoff; cellSize += preciseStep)
+    for (size_t cellSize = preciseStep; cellSize <= preciseCutoff; cellSize += preciseStep) {
         allocatorFor(cellSize).reset();
+        destructorAllocatorFor(cellSize).reset();
+    }

-    for (size_t cellSize = impreciseStep; cellSize <= impreciseCutoff; cellSize += impreciseStep)
+    for (size_t cellSize = impreciseStep; cellSize <= impreciseCutoff; cellSize += impreciseStep) {
         allocatorFor(cellSize).reset();
+        destructorAllocatorFor(cellSize).reset();
+    }
 }

 void MarkedSpace::canonicalizeCellLivenessData()
 {
-    for (size_t cellSize = preciseStep; cellSize <= preciseCutoff; cellSize += preciseStep)
+    for (size_t cellSize = preciseStep; cellSize <= preciseCutoff; cellSize += preciseStep) {
         allocatorFor(cellSize).zapFreeList();
+        destructorAllocatorFor(cellSize).zapFreeList();
+    }

-    for (size_t cellSize = impreciseStep; cellSize <= impreciseCutoff; cellSize += impreciseStep)
+    for (size_t cellSize = impreciseStep; cellSize <= impreciseCutoff; cellSize += impreciseStep) {
         allocatorFor(cellSize).zapFreeList();
+        destructorAllocatorFor(cellSize).zapFreeList();
+    }
 }

@@
         return;

-    m_markedSpace->allocatorFor(block->cellSize()).removeBlock(block);
+    m_markedSpace->allocatorFor(block).removeBlock(block);
     m_empties.append(block);
 }
  • trunk/Source/JavaScriptCore/heap/MarkedSpace.h

    r106676 → r107445

@@

     MarkedAllocator& allocatorFor(size_t);
-    void* allocate(size_t);
+    MarkedAllocator& allocatorFor(MarkedBlock*);
+    MarkedAllocator& destructorAllocatorFor(size_t);
+    void* allocateWithDestructor(size_t);
+    void* allocateWithoutDestructor(size_t);

     void resetAllocators();
@@
     static const size_t impreciseCount = impreciseCutoff / impreciseStep;

-    FixedArray<MarkedAllocator, preciseCount> m_preciseSizeClasses;
-    FixedArray<MarkedAllocator, impreciseCount> m_impreciseSizeClasses;
+    struct Subspace {
+        FixedArray<MarkedAllocator, preciseCount> preciseAllocators;
+        FixedArray<MarkedAllocator, impreciseCount> impreciseAllocators;
+    };
+
+    Subspace m_destructorSpace;
+    Subspace m_normalSpace;
+
     size_t m_waterMark;
     size_t m_nurseryWaterMark;
@@
     ASSERT(bytes && bytes <= maxCellSize);
     if (bytes <= preciseCutoff)
-        return m_preciseSizeClasses[(bytes - 1) / preciseStep];
-    return m_impreciseSizeClasses[(bytes - 1) / impreciseStep];
+        return m_normalSpace.preciseAllocators[(bytes - 1) / preciseStep];
+    return m_normalSpace.impreciseAllocators[(bytes - 1) / impreciseStep];
 }

-inline void* MarkedSpace::allocate(size_t bytes)
+inline MarkedAllocator& MarkedSpace::allocatorFor(MarkedBlock* block)
+{
+    if (block->cellsNeedDestruction())
+        return destructorAllocatorFor(block->cellSize());
+    return allocatorFor(block->cellSize());
+}
+
+inline MarkedAllocator& MarkedSpace::destructorAllocatorFor(size_t bytes)
+{
+    ASSERT(bytes && bytes <= maxCellSize);
+    if (bytes <= preciseCutoff)
+        return m_destructorSpace.preciseAllocators[(bytes - 1) / preciseStep];
+    return m_destructorSpace.impreciseAllocators[(bytes - 1) / impreciseStep];
+}
+
+inline void* MarkedSpace::allocateWithoutDestructor(size_t bytes)
 {
     return allocatorFor(bytes).allocate();
+}
+
+inline void* MarkedSpace::allocateWithDestructor(size_t bytes)
+{
+    return destructorAllocatorFor(bytes).allocate();
 }

@@
 {
     for (size_t i = 0; i < preciseCount; ++i) {
-        m_preciseSizeClasses[i].forEachBlock(functor);
+        m_normalSpace.preciseAllocators[i].forEachBlock(functor);
+        m_destructorSpace.preciseAllocators[i].forEachBlock(functor);
     }

     for (size_t i = 0; i < impreciseCount; ++i) {
-        m_impreciseSizeClasses[i].forEachBlock(functor);
+        m_normalSpace.impreciseAllocators[i].forEachBlock(functor);
+        m_destructorSpace.impreciseAllocators[i].forEachBlock(functor);
     }
