Changeset 52082 in webkit for trunk/JavaScriptCore/runtime


Timestamp: Dec 14, 2009, 12:13:24 AM
Author: [email protected]
Message:

JavaScriptCore: Changed GC from mark-sweep to mark-allocate.

Reviewed by Sam Weinig.

Added WeakGCMap to keep WebCore blissfully ignorant about objects that
have become garbage but haven't run their destructors yet.

1% SunSpider speedup.
7.6% v8 speedup (37% splay speedup).
17% speedup on bench-alloc-nonretained.js.
18% speedup on bench-alloc-retained.js.

  • API/JSBase.cpp:

(JSGarbageCollect):

files.

  • debugger/Debugger.cpp:

(JSC::Debugger::recompileAllJSFunctions): Updated to use the Collector
iterator abstraction.

  • jsc.cpp:

(functionGC): Updated for rename.

  • runtime/Collector.cpp: Slightly reduced the number of allocations per

collection, so that small workloads only allocate one collector block,
rather than two.

(JSC::Heap::Heap): Updated to use the new allocateBlock function.

(JSC::Heap::destroy): Updated to use the new freeBlocks function.

(JSC::Heap::allocateBlock): New function to initialize a block when
allocating it.

(JSC::Heap::freeBlock): Consolidated the responsibility for running
destructors into this function.

(JSC::Heap::freeBlocks): Updated to use freeBlock.

(JSC::Heap::recordExtraCost): Sweep the heap in this reporting function,
so that allocation, which is more common, doesn't have to check extraCost.

(JSC::Heap::heapAllocate): Run destructors right before recycling a
garbage cell. This has better cache utilization than a separate sweep phase.
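As a rough illustration of the mark-allocate idea described above (lazily "sweeping" a garbage cell at the moment it is recycled, rather than in a separate sweep pass), here is a toy model. `ToyHeap`, its single mark-bit vector, and its trivial `reset()` are inventions of this sketch, not the real `Heap` code; destructor calls and real marking are elided.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Toy model of the mark-allocate fast path: allocation scans forward for the
// next unmarked (garbage) cell and recycles it in place. When the scan reaches
// the end of the heap, the heap is "reset" (re-marked) and the scan restarts.
struct ToyHeap {
    std::vector<bool> marked; // one mark bit per cell
    size_t nextCell = 0;
    size_t collections = 0;

    explicit ToyHeap(size_t cells) : marked(cells, false) {}

    // Returns the index of the recycled cell.
    size_t allocate() {
        while (true) {
            while (nextCell < marked.size()) {
                if (!marked[nextCell])
                    return nextCell++; // recycle a garbage cell in place
                ++nextCell;
            }
            // Reached the end of the heap: mark live objects and start over.
            reset();
        }
    }

    void reset() {
        ++collections;
        // A real heap would re-mark live objects here; the toy just clears
        // all marks so every cell becomes available again.
        marked.assign(marked.size(), false);
        nextCell = 0;
    }
};
```

The point of the structure is that the common path touches each cell exactly once, right before reuse, which is where the cache-utilization win described above comes from.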

(JSC::Heap::resizeBlocks):
(JSC::Heap::growBlocks):
(JSC::Heap::shrinkBlocks): New set of functions for managing the size of
the heap, now that the heap doesn't maintain any information about its
size.

(JSC::isPointerAligned):
(JSC::isHalfCellAligned):
(JSC::isPossibleCell):
(JSC::isCellAligned):
(JSC::Heap::markConservatively): Cleaned up this code a bit.

(JSC::Heap::clearMarkBits):
(JSC::Heap::markedCells): Some helper functions for examining the mark
bitmap.

(JSC::Heap::sweep): Simplified this function by using a DeadObjectIterator.

(JSC::Heap::markRoots): Reordered some operations for clarity.

(JSC::Heap::objectCount):
(JSC::Heap::addToStatistics):
(JSC::Heap::statistics): Rewrote these functions to calculate an object
count on demand, since the heap doesn't maintain this information by
itself.

(JSC::Heap::reset): New function for resetting the heap once we've
exhausted heap space.

(JSC::Heap::collectAllGarbage): This function matches the old collect()
behavior, but it's now an uncommon function used only by the API.

  • runtime/Collector.h:

(JSC::CollectorBitmap::count):
(JSC::CollectorBitmap::isEmpty): Added some helper functions for managing
the collector mark bitmap.
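For illustration, a bitmap with count() and isEmpty() helpers in the spirit of these additions might look like the sketch below. `MarkBitmap`, its word count, and the bit-by-bit count() loop are assumptions of this sketch, not the real CollectorBitmap.

```cpp
#include <array>
#include <cassert>
#include <cstddef>
#include <cstdint>

// Hypothetical simplified mark bitmap. The real CollectorBitmap is sized to
// the collector block; this sketch uses a fixed 128 bits for demonstration.
struct MarkBitmap {
    static const size_t wordCount = 4; // 4 x 32 bits = 128 cells
    std::array<uint32_t, wordCount> bits {};

    void set(size_t n) { bits[n >> 5] |= (1u << (n & 31)); }
    bool get(size_t n) const { return bits[n >> 5] & (1u << (n & 31)); }

    // Count set bits at or after startCell, the way the heap computes
    // object counts on demand instead of maintaining a live counter.
    size_t count(size_t startCell = 0) const {
        size_t result = 0;
        for (size_t n = startCell; n < wordCount * 32; ++n)
            result += get(n);
        return result;
    }

    bool isEmpty() const { return count() == 0; }
};
```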

(JSC::Heap::reportExtraMemoryCost): Changed reporting from cell equivalents
to bytes, so it's easier to understand.

  • runtime/CollectorHeapIterator.h:

(JSC::CollectorHeapIterator::CollectorHeapIterator):
(JSC::CollectorHeapIterator::operator!=):
(JSC::CollectorHeapIterator::operator*):
(JSC::CollectorHeapIterator::advance):
(JSC::::LiveObjectIterator):
(JSC::::operator):
(JSC::::DeadObjectIterator):
(JSC::::ObjectIterator): New iterators for encapsulating details about
heap layout, and what's live and dead on the heap.
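To illustrate the iterator idea (not the real CollectorHeapIterator code), here is a minimal sketch of a dead-object walk over a single block's mark bits: the iterator yields only unmarked cells, so a caller like sweep() never needs to know the heap's layout. `DeadCellIterator` and its `atEnd()` helper are inventions of this sketch.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Yields indices of cells whose mark bit is clear (i.e. dead cells),
// skipping live ones -- the encapsulation DeadObjectIterator provides.
struct DeadCellIterator {
    const std::vector<bool>& marked; // one mark bit per cell
    size_t cell;

    DeadCellIterator(const std::vector<bool>& marks, size_t start)
        : marked(marks), cell(start) { skipLive(); }

    void skipLive() {
        while (cell < marked.size() && marked[cell])
            ++cell;
    }

    size_t operator*() const { return cell; }
    DeadCellIterator& operator++() { ++cell; skipLive(); return *this; }
    bool atEnd() const { return cell == marked.size(); }
};
```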

  • runtime/JSArray.cpp:

(JSC::JSArray::putSlowCase):
(JSC::JSArray::increaseVectorLength): Delay reporting extra cost until
we're fully constructed, so the heap mark phase won't visit us in an
invalid state.

  • runtime/JSCell.h:

(JSC::JSCell::):
(JSC::JSCell::createDummyStructure):
(JSC::JSCell::JSCell):

  • runtime/JSGlobalData.cpp:

(JSC::JSGlobalData::JSGlobalData):

  • runtime/JSGlobalData.h: Added a dummy cell to simplify allocation logic.
  • runtime/JSString.h:

(JSC::jsSubstring): Don't report extra cost for substrings, since they
share a buffer that's already reported extra cost.

  • runtime/Tracing.d:
  • runtime/Tracing.h: Changed these dtrace hooks not to report object

counts, since they're no longer cheap to compute.

  • runtime/UString.h: Updated for renames.
  • runtime/WeakGCMap.h: Added.

(JSC::WeakGCMap::isEmpty):
(JSC::WeakGCMap::uncheckedGet):
(JSC::WeakGCMap::uncheckedBegin):
(JSC::WeakGCMap::uncheckedEnd):
(JSC::::get):
(JSC::::take):
(JSC::::set):
(JSC::::uncheckedRemove): Mentioned above.
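A simplified sketch of the WeakGCMap idea: checked lookups consult a liveness test, so callers never observe cells that are garbage but haven't run their destructors yet, while the unchecked variants skip the test. The `isLive` predicate stands in for the collector's mark-bit query, and `WeakMapSketch` is an invented name; this is not the real WeakGCMap API.

```cpp
#include <cassert>
#include <functional>
#include <unordered_map>

template <typename Key, typename Value>
class WeakMapSketch {
public:
    explicit WeakMapSketch(std::function<bool(Value)> isLive)
        : m_isLive(isLive) {}

    void set(const Key& key, Value value) { m_map[key] = value; }

    // Like WeakGCMap::get -- returns a value only if it is still live;
    // otherwise behaves as if the entry were absent.
    Value get(const Key& key) const {
        auto it = m_map.find(key);
        if (it == m_map.end() || !m_isLive(it->second))
            return Value();
        return it->second;
    }

    // Like uncheckedGet -- skips the liveness check; only safe when the
    // caller knows no collection has intervened.
    Value uncheckedGet(const Key& key) const {
        auto it = m_map.find(key);
        return it == m_map.end() ? Value() : it->second;
    }

private:
    std::function<bool(Value)> m_isLive;
    std::unordered_map<Key, Value> m_map;
};
```

The design point is the one in the ChangeLog above: the map itself stays ignorant of GC timing, and staleness is filtered at lookup rather than eagerly purged.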

  • wtf/StdLibExtras.h:

(WTF::bitCount): Added a bit population count function, so the heap can
count live objects to fulfill statistics questions.
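A bit population count along these lines can be written with the classic parallel-sum trick shown below. This is a common portable formulation, offered as a sketch of the idea rather than a claim about the exact implementation added to StdLibExtras.h.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Counts the set bits in a 32-bit word by summing bits in parallel:
// pairs, then nibbles, then bytes, finishing with a multiply that
// accumulates the byte sums into the top byte.
inline size_t bitCount(uint32_t bits)
{
    bits = bits - ((bits >> 1) & 0x55555555);
    bits = (bits & 0x33333333) + ((bits >> 2) & 0x33333333);
    return (((bits + (bits >> 4)) & 0x0F0F0F0F) * 0x01010101) >> 24;
}
```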

JavaScriptGlue: Changed GC from mark-sweep to mark-allocate.

Reviewed by Sam Weinig.

  • JavaScriptGlue.cpp:

(JSCollect): Updated for rename. Fixed a bug where JSGlue would not check
to avoid nested GC calls.

WebCore: Changed GC from mark-sweep to mark-allocate.

Reviewed by Sam Weinig.

  • ForwardingHeaders/runtime/WeakGCMap.h: Added.
  • bindings/js/GCController.cpp:

(WebCore::collect):
(WebCore::GCController::gcTimerFired):
(WebCore::GCController::garbageCollectNow): Updated for rename.

  • bindings/js/JSDOMBinding.cpp:

(WebCore::removeWrappers):
(WebCore::hasCachedDOMObjectWrapperUnchecked):
(WebCore::hasCachedDOMObjectWrapper):
(WebCore::hasCachedDOMNodeWrapperUnchecked):
(WebCore::forgetDOMObject):
(WebCore::forgetDOMNode):
(WebCore::isObservableThroughDOM):
(WebCore::markDOMNodesForDocument):
(WebCore::markDOMObjectWrapper):
(WebCore::markDOMNodeWrapper):

  • bindings/js/JSDOMBinding.h: Changed DOM wrapper maps to be WeakGCMaps.

Don't ASSERT that an item must be in the WeakGCMap when its destructor
runs, since it might have been overwritten in the map first.

  • bindings/js/JSDocumentCustom.cpp:

(WebCore::toJS): Changed Document from a DOM object wrapper to a DOM node
wrapper, to simplify some code.

  • bindings/js/JSInspectedObjectWrapper.cpp:

(WebCore::JSInspectedObjectWrapper::JSInspectedObjectWrapper):
(WebCore::JSInspectedObjectWrapper::~JSInspectedObjectWrapper):

  • bindings/js/JSInspectorCallbackWrapper.cpp: Use a WeakGCMap for these

wrappers.

  • bindings/js/JSNodeCustom.cpp:

(WebCore::JSNode::markChildren): Updated for WeakGCMap and Document using
a DOM node wrapper instead of a DOM object wrapper.

  • bindings/js/JSSVGPODTypeWrapper.h:

(WebCore::JSSVGDynamicPODTypeWrapperCache::wrapperMap):
(WebCore::JSSVGDynamicPODTypeWrapperCache::lookupOrCreateWrapper):
(WebCore::JSSVGDynamicPODTypeWrapperCache::forgetWrapper):
(WebCore::::~JSSVGDynamicPODTypeWrapper): Shined a small beam of sanity light
on this code. Use hashtable-based lookup in JSSVGPODTypeWrapper.h instead
of linear lookup through iteration, since that's what hashtables were
invented for. Make JSSVGPODTypeWrapper.h responsible for removing itself
from the table, instead of its JS wrapper, to decouple these objects from
GC, and because these objects are refCounted, not solely owned by their
JS wrappers.

  • bindings/scripts/CodeGeneratorJS.pm:
  • dom/Document.h: Adopted changes above.
Location: trunk/JavaScriptCore/runtime
Files: 1 added, 11 edited

  • trunk/JavaScriptCore/runtime/Collector.cpp

--- trunk/JavaScriptCore/runtime/Collector.cpp (r52047)
+++ trunk/JavaScriptCore/runtime/Collector.cpp (r52082)
@@ -105 +105 @@
 const size_t GROWTH_FACTOR = 2;
 const size_t LOW_WATER_FACTOR = 4;
-const size_t ALLOCATIONS_PER_COLLECTION = 4000;
+const size_t ALLOCATIONS_PER_COLLECTION = 3600;
 // This value has to be a macro to be used in max() without introducing
 // a PIC branch in Mach-O binaries, see <rdar://problem/5971391>.
@@ -149 +149 @@
 {
     ASSERT(globalData);
-
+
 #if PLATFORM(SYMBIAN)
     // Symbian OpenC supports mmap but currently not the MAP_ANON flag.
@@ -171 +171 @@
 
     memset(&primaryHeap, 0, sizeof(CollectorHeap));
+    allocateBlock<PrimaryHeap>();
+
     memset(&numberHeap, 0, sizeof(CollectorHeap));
+#if USE(JSVALUE32)
+    allocateBlock<NumberHeap>();
+#endif
 }
 
@@ -194 +199 @@
     m_markListSet = 0;
 
-    sweep<PrimaryHeap>();
-    // No need to sweep number heap, because the JSNumber destructor doesn't do anything.
-#if ENABLE(JSC_ZOMBIES)
-    ASSERT(primaryHeap.numLiveObjects == primaryHeap.numZombies);
-#else
-    ASSERT(!primaryHeap.numLiveObjects);
-#endif
-    freeBlocks(&primaryHeap);
-    freeBlocks(&numberHeap);
+    freeBlocks<PrimaryHeap>();
+    freeBlocks<NumberHeap>();
 
 #if ENABLE(JSC_MULTIPLE_THREADS)
@@ -226 +224 @@
 #if PLATFORM(DARWIN)
     vm_address_t address = 0;
-    // FIXME: tag the region as a JavaScriptCore heap when we get a registered VM tag: <rdar://problem/6054788>.
     vm_map(current_task(), &address, BLOCK_SIZE, BLOCK_OFFSET_MASK, VM_FLAGS_ANYWHERE | VM_TAG_FOR_COLLECTOR_MEMORY, MEMORY_OBJECT_NULL, 0, FALSE, VM_PROT_DEFAULT, VM_PROT_DEFAULT, VM_INHERIT_DEFAULT);
 #elif PLATFORM(SYMBIAN)
@@ -234 +231 @@
         CRASH();
     uintptr_t address = reinterpret_cast<uintptr_t>(mask);
-
-    memset(reinterpret_cast<void*>(address), 0, BLOCK_SIZE);
 #elif PLATFORM(WINCE)
     void* address = VirtualAlloc(NULL, BLOCK_SIZE, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
@@ -248 +243 @@
     void* address;
     posix_memalign(&address, BLOCK_SIZE, BLOCK_SIZE);
-    memset(address, 0, BLOCK_SIZE);
 #else
 
@@ -274 +268 @@
 
     address += adjust;
-    memset(reinterpret_cast<void*>(address), 0, BLOCK_SIZE);
-#endif
+#endif
+
+    // Initialize block.
 
     CollectorBlock* block = reinterpret_cast<CollectorBlock*>(address);
-    block->freeList = block->cells;
     block->heap = this;
     block->type = heapType;
+    clearMarkBits<heapType>(block);
+
+    // heapAllocate assumes that it's safe to call a destructor on any cell in the primary heap.
+    if (heapType != NumberHeap) {
+        Structure* dummyMarkableCellStructure = m_globalData->dummyMarkableCellStructure.get();
+        for (size_t i = 0; i < HeapConstants<heapType>::cellsPerBlock; ++i)
+            new (block->cells + i) JSCell(dummyMarkableCellStructure);
+    }
+
+    // Add block to blocks vector.
 
     CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
@@ -302 +306 @@
     CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
 
+    if (heapType != NumberHeap) {
+        ObjectIterator<heapType> it(heap, block);
+        ObjectIterator<heapType> end(heap, block + 1);
+        for ( ; it != end; ++it)
+            (*it)->~JSCell();
+    }
     freeBlock(heap.blocks[block]);
 
@@ -335 +345 @@
 }
 
-void Heap::freeBlocks(CollectorHeap* heap)
-{
-    for (size_t i = 0; i < heap->usedBlocks; ++i)
-        if (heap->blocks[i])
-            freeBlock(heap->blocks[i]);
-    fastFree(heap->blocks);
-    memset(heap, 0, sizeof(CollectorHeap));
+template <HeapType heapType>
+void Heap::freeBlocks()
+{
+    CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
+
+    while (heap.usedBlocks)
+        freeBlock<heapType>(0);
+    fastFree(heap.blocks);
+    memset(&heap, 0, sizeof(CollectorHeap));
 }
 
@@ -358 +370 @@
     // NOTE: we target the primaryHeap unconditionally as JSNumber doesn't modify cost
 
+    if (primaryHeap.extraCost > maxExtraCost && primaryHeap.extraCost > primaryHeap.usedBlocks * BLOCK_SIZE / 2) {
+        // If the last iteration through the heap deallocated blocks, we need
+        // to clean up remaining garbage before marking. Otherwise, the conservative
+        // marking mechanism might follow a pointer to unmapped memory.
+        if (primaryHeap.didShrink)
+            sweep<PrimaryHeap>();
+        reset();
+    }
     primaryHeap.extraCost += cost;
 }
@@ -365 +385 @@
     typedef typename HeapConstants<heapType>::Block Block;
     typedef typename HeapConstants<heapType>::Cell Cell;
-
+
     CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
+
     ASSERT(JSLock::lockCount() > 0);
     ASSERT(JSLock::currentThreadIsHoldingLock());
@@ -373 +394 @@
     ASSERT(heap.operationInProgress == NoOperation);
     ASSERT(heapType == PrimaryHeap || heap.extraCost == 0);
-    // FIXME: If another global variable access here doesn't hurt performance
-    // too much, we could CRASH() in NDEBUG builds, which could help ensure we
-    // don't spend any time debugging cases where we allocate inside an object's
-    // deallocation code.
 
 #if COLLECT_ON_EVERY_ALLOCATION
-    collect();
-#endif
-
-    size_t numLiveObjects = heap.numLiveObjects;
-    size_t usedBlocks = heap.usedBlocks;
-    size_t i = heap.firstBlockWithPossibleSpace;
-
-    // if we have a huge amount of extra cost, we'll try to collect even if we still have
-    // free cells left.
-    if (heapType == PrimaryHeap && heap.extraCost > ALLOCATIONS_PER_COLLECTION) {
-        size_t numLiveObjectsAtLastCollect = heap.numLiveObjectsAtLastCollect;
-        size_t numNewObjects = numLiveObjects - numLiveObjectsAtLastCollect;
-        const size_t newCost = numNewObjects + heap.extraCost;
-        if (newCost >= ALLOCATIONS_PER_COLLECTION && newCost >= numLiveObjectsAtLastCollect)
-            goto collect;
-    }
-
+    collectAllGarbage();
     ASSERT(heap.operationInProgress == NoOperation);
-#ifndef NDEBUG
-    // FIXME: Consider doing this in NDEBUG builds too (see comment above).
-    heap.operationInProgress = Allocation;
-#endif
-
-scan:
-    Block* targetBlock;
-    size_t targetBlockUsedCells;
-    if (i != usedBlocks) {
-        targetBlock = reinterpret_cast<Block*>(heap.blocks[i]);
-        targetBlockUsedCells = targetBlock->usedCells;
-        ASSERT(targetBlockUsedCells <= HeapConstants<heapType>::cellsPerBlock);
-        while (targetBlockUsedCells == HeapConstants<heapType>::cellsPerBlock) {
-            if (++i == usedBlocks)
-                goto collect;
-            targetBlock = reinterpret_cast<Block*>(heap.blocks[i]);
-            targetBlockUsedCells = targetBlock->usedCells;
-            ASSERT(targetBlockUsedCells <= HeapConstants<heapType>::cellsPerBlock);
-        }
-        heap.firstBlockWithPossibleSpace = i;
-    } else {
-
-collect:
-        size_t numLiveObjectsAtLastCollect = heap.numLiveObjectsAtLastCollect;
-        size_t numNewObjects = numLiveObjects - numLiveObjectsAtLastCollect;
-        const size_t newCost = numNewObjects + heap.extraCost;
-
-        if (newCost >= ALLOCATIONS_PER_COLLECTION && newCost >= numLiveObjectsAtLastCollect) {
-#ifndef NDEBUG
-            heap.operationInProgress = NoOperation;
-#endif
-            bool foundGarbage = collect();
-            numLiveObjects = heap.numLiveObjects;
-            usedBlocks = heap.usedBlocks;
-            i = heap.firstBlockWithPossibleSpace;
-#ifndef NDEBUG
-            heap.operationInProgress = Allocation;
-#endif
-            if (foundGarbage)
-                goto scan;
-        }
-
-        // didn't find a block, and GC didn't reclaim anything, need to allocate a new block
-        targetBlock = reinterpret_cast<Block*>(allocateBlock<heapType>());
-        heap.firstBlockWithPossibleSpace = heap.usedBlocks - 1;
-        targetBlockUsedCells = 0;
-    }
-
-    // find a free spot in the block and detach it from the free list
-    Cell* newCell = targetBlock->freeList;
-
-    // "next" field is a cell offset -- 0 means next cell, so a zeroed block is already initialized
-    targetBlock->freeList = (newCell + 1) + newCell->u.freeCell.next;
-
-    targetBlock->usedCells = static_cast<uint32_t>(targetBlockUsedCells + 1);
-    heap.numLiveObjects = numLiveObjects + 1;
-
-#ifndef NDEBUG
-    // FIXME: Consider doing this in NDEBUG builds too (see comment above).
-    heap.operationInProgress = NoOperation;
-#endif
-
-    return newCell;
+#endif
+
+allocate:
+
+    // Fast case: find the next garbage cell and recycle it.
+
+    do {
+        ASSERT(heap.nextBlock < heap.usedBlocks);
+        Block* block = reinterpret_cast<Block*>(heap.blocks[heap.nextBlock]);
+        do {
+            ASSERT(heap.nextCell < HeapConstants<heapType>::cellsPerBlock);
+            if (!block->marked.get(heap.nextCell >> HeapConstants<heapType>::bitmapShift)) { // Always false for the last cell in the block
+                Cell* cell = block->cells + heap.nextCell;
+                if (heapType != NumberHeap) {
+                    heap.operationInProgress = Allocation;
+                    JSCell* imp = reinterpret_cast<JSCell*>(cell);
+                    imp->~JSCell();
+                    heap.operationInProgress = NoOperation;
+                }
+                ++heap.nextCell;
+                return cell;
+            }
+        } while (++heap.nextCell != HeapConstants<heapType>::cellsPerBlock);
+        heap.nextCell = 0;
+    } while (++heap.nextBlock != heap.usedBlocks);
+
+    // Slow case: reached the end of the heap. Mark live objects and start over.
+
+    reset();
+    goto allocate;
+}
+
+template <HeapType heapType>
+void Heap::resizeBlocks()
+{
+    CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
+
+    heap.didShrink = false;
+
+    size_t usedCellCount = markedCells<heapType>();
+    size_t minCellCount = usedCellCount + max(ALLOCATIONS_PER_COLLECTION, usedCellCount);
+    size_t minBlockCount = (minCellCount + HeapConstants<heapType>::cellsPerBlock - 1) / HeapConstants<heapType>::cellsPerBlock;
+
+    size_t maxCellCount = 1.25f * minCellCount;
+    size_t maxBlockCount = (maxCellCount + HeapConstants<heapType>::cellsPerBlock - 1) / HeapConstants<heapType>::cellsPerBlock;
+
+    if (heap.usedBlocks < minBlockCount)
+        growBlocks<heapType>(minBlockCount);
+    else if (heap.usedBlocks > maxBlockCount)
+        shrinkBlocks<heapType>(maxBlockCount);
+}
+
+template <HeapType heapType>
+void Heap::growBlocks(size_t neededBlocks)
+{
+    CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
+    ASSERT(heap.usedBlocks < neededBlocks);
+    while (heap.usedBlocks < neededBlocks)
+        allocateBlock<heapType>();
+}
+
+template <HeapType heapType>
+void Heap::shrinkBlocks(size_t neededBlocks)
+{
+    CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
+    ASSERT(heap.usedBlocks > neededBlocks);
+
+    // Clear the always-on last bit, so isEmpty() isn't fooled by it.
+    for (size_t i = 0; i < heap.usedBlocks; ++i)
+        heap.blocks[i]->marked.clear((HeapConstants<heapType>::cellsPerBlock - 1) >> HeapConstants<heapType>::bitmapShift);
+
+    for (size_t i = 0; i != heap.usedBlocks && heap.usedBlocks != neededBlocks; ) {
+        if (heap.blocks[i]->marked.isEmpty()) {
+            freeBlock<heapType>(i);
+            heap.didShrink = true;
+        } else
+            ++i;
+    }
+
+    // Reset the always-on last bit.
+    for (size_t i = 0; i < heap.usedBlocks; ++i)
+        heap.blocks[i]->marked.set((HeapConstants<heapType>::cellsPerBlock - 1) >> HeapConstants<heapType>::bitmapShift);
 }
 
@@ -715 +735 @@
 #endif
 
-#define IS_POINTER_ALIGNED(p) (((intptr_t)(p) & (sizeof(char*) - 1)) == 0)
-
-// cell size needs to be a power of two for this to be valid
-#define IS_HALF_CELL_ALIGNED(p) (((intptr_t)(p) & (CELL_MASK >> 1)) == 0)
+inline bool isPointerAligned(void* p)
+{
+    return (((intptr_t)(p) & (sizeof(char*) - 1)) == 0);
+}
+
+// Cell size needs to be a power of two for isPossibleCell to be valid.
+COMPILE_ASSERT(sizeof(CollectorCell) % 2 == 0, Collector_cell_size_is_power_of_two);
+
+#if USE(JSVALUE32)
+static bool isHalfCellAligned(void *p)
+{
+    return (((intptr_t)(p) & (CELL_MASK >> 1)) == 0);
+}
+
+static inline bool isPossibleCell(void* p)
+{
+    return isHalfCellAligned(p) && p;
+}
+
+#else
+
+static inline bool isCellAligned(void *p)
+{
+    return (((intptr_t)(p) & CELL_MASK) == 0);
+}
+
+static inline bool isPossibleCell(void* p)
+{
+    return isCellAligned(p) && p;
+}
+#endif
 
 void Heap::markConservatively(MarkStack& markStack, void* start, void* end)
@@ -729 +776 @@
 
     ASSERT((static_cast<char*>(end) - static_cast<char*>(start)) < 0x1000000);
-    ASSERT(IS_POINTER_ALIGNED(start));
-    ASSERT(IS_POINTER_ALIGNED(end));
+    ASSERT(isPointerAligned(start));
+    ASSERT(isPointerAligned(end));
 
     char** p = static_cast<char**>(start);
     char** e = static_cast<char**>(end);
 
-    size_t usedPrimaryBlocks = primaryHeap.usedBlocks;
-    size_t usedNumberBlocks = numberHeap.usedBlocks;
     CollectorBlock** primaryBlocks = primaryHeap.blocks;
+#if USE(JSVALUE32)
     CollectorBlock** numberBlocks = numberHeap.blocks;
-
-    const size_t lastCellOffset = sizeof(CollectorCell) * (CELLS_PER_BLOCK - 1);
+#endif
 
     while (p != e) {
         char* x = *p++;
-        if (IS_HALF_CELL_ALIGNED(x) && x) {
+        if (isPossibleCell(x)) {
             uintptr_t xAsBits = reinterpret_cast<uintptr_t>(x);
             xAsBits &= CELL_ALIGN_MASK;
+
             uintptr_t offset = xAsBits & BLOCK_OFFSET_MASK;
+            const size_t lastCellOffset = sizeof(CollectorCell) * (CELLS_PER_BLOCK - 1);
+            if (offset > lastCellOffset)
+                continue;
+
             CollectorBlock* blockAddr = reinterpret_cast<CollectorBlock*>(xAsBits - offset);
+#if USE(JSVALUE32)
             // Mark the the number heap, we can mark these Cells directly to avoid the virtual call cost
+            size_t usedNumberBlocks = numberHeap.usedBlocks;
             for (size_t block = 0; block < usedNumberBlocks; block++) {
-                if ((numberBlocks[block] == blockAddr) & (offset <= lastCellOffset)) {
-                    Heap::markCell(reinterpret_cast<JSCell*>(xAsBits));
-                    goto endMarkLoop;
-                }
+                if (numberBlocks[block] != blockAddr)
+                    continue;
+                Heap::markCell(reinterpret_cast<JSCell*>(xAsBits));
+                goto endMarkLoop;
             }
-
+#endif
+
             // Mark the primary heap
+            size_t usedPrimaryBlocks = primaryHeap.usedBlocks;
             for (size_t block = 0; block < usedPrimaryBlocks; block++) {
-                if ((primaryBlocks[block] == blockAddr) & (offset <= lastCellOffset)) {
-                    if (reinterpret_cast<CollectorCell*>(xAsBits)->u.freeCell.zeroIfFree) {
-                        markStack.append(reinterpret_cast<JSCell*>(xAsBits));
-                        markStack.drain();
-                    }
-                    break;
-                }
+                if (primaryBlocks[block] != blockAddr)
+                    continue;
+                markStack.append(reinterpret_cast<JSCell*>(xAsBits));
+                markStack.drain();
             }
+#if USE(JSVALUE32)
         endMarkLoop:
             ;
+#endif
         }
     }
@@ -1010 +1063 @@
 }
 
-template <HeapType heapType> size_t Heap::sweep()
-{
-    typedef typename HeapConstants<heapType>::Block Block;
-    typedef typename HeapConstants<heapType>::Cell Cell;
-
-    // SWEEP: delete everything with a zero refcount (garbage) and unmark everything else
+template <HeapType heapType>
+void Heap::clearMarkBits()
+{
     CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
+    for (size_t i = 0; i < heap.usedBlocks; ++i)
+        clearMarkBits<heapType>(heap.blocks[i]);
+}
+
+template <HeapType heapType>
+void Heap::clearMarkBits(CollectorBlock* block)
+{
+    // heapAllocate assumes that the last cell in every block is marked.
+    block->marked.clearAll();
+    block->marked.set((HeapConstants<heapType>::cellsPerBlock - 1) >> HeapConstants<heapType>::bitmapShift);
+}
+
+template <HeapType heapType>
+size_t Heap::markedCells(size_t startBlock, size_t startCell) const
+{
+    const CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
+    ASSERT(startBlock <= heap.usedBlocks);
+    ASSERT(startCell < HeapConstants<heapType>::cellsPerBlock);
+
+    if (startBlock >= heap.usedBlocks)
+        return 0;
+
+    size_t result = 0;
+    result += heap.blocks[startBlock]->marked.count(startCell);
+    for (size_t i = startBlock + 1; i < heap.usedBlocks; ++i)
+        result += heap.blocks[i]->marked.count();
+
+    return result;
+}
+
+template <HeapType heapType>
+void Heap::sweep()
+{
+    ASSERT(heapType != NumberHeap); // The number heap does not contain meaningful destructors.
+
+    CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
+
+    ASSERT(heap.operationInProgress == NoOperation);
+    if (heap.operationInProgress != NoOperation)
+        CRASH();
+    heap.operationInProgress = Collection;
 
-    size_t emptyBlocks = 0;
-    size_t numLiveObjects = heap.numLiveObjects;
-
-    for (size_t block = 0; block < heap.usedBlocks; block++) {
-        Block* curBlock = reinterpret_cast<Block*>(heap.blocks[block]);
-
-        size_t usedCells = curBlock->usedCells;
-        Cell* freeList = curBlock->freeList;
-
-        if (usedCells == HeapConstants<heapType>::cellsPerBlock) {
-            // special case with a block where all cells are used -- testing indicates this happens often
-            for (size_t i = 0; i < HeapConstants<heapType>::cellsPerBlock; i++) {
-                if (!curBlock->marked.get(i >> HeapConstants<heapType>::bitmapShift)) {
-                    Cell* cell = curBlock->cells + i;
-
-                    if (heapType != NumberHeap) {
-                        JSCell* imp = reinterpret_cast<JSCell*>(cell);
-                        // special case for allocated but uninitialized object
-                        // (We don't need this check earlier because nothing prior this point
-                        // assumes the object has a valid vptr.)
-                        if (cell->u.freeCell.zeroIfFree == 0)
-                            continue;
+#if !ENABLE(JSC_ZOMBIES)
+    Structure* dummyMarkableCellStructure = m_globalData->dummyMarkableCellStructure.get();
+#endif
+
+    DeadObjectIterator<heapType> it(heap, heap.nextBlock, heap.nextCell);
+    DeadObjectIterator<heapType> end(heap, heap.usedBlocks);
+    for ( ; it != end; ++it) {
+        JSCell* cell = *it;
 #if ENABLE(JSC_ZOMBIES)
-                        if (!imp->isZombie()) {
-                            const ClassInfo* info = imp->classInfo();
-                            imp->~JSCell();
-                            new (imp) JSZombie(info, JSZombie::leakedZombieStructure());
-                            heap.numZombies++;
-                        }
-#else
-                        imp->~JSCell();
-#endif
-                    }
-                    --numLiveObjects;
-#if !ENABLE(JSC_ZOMBIES)
-                    --usedCells;
-
-                    // put cell on the free list
-                    cell->u.freeCell.zeroIfFree = 0;
-                    cell->u.freeCell.next = freeList - (cell + 1);
-                    freeList = cell;
-#endif
-                }
-            }
-        } else {
-            size_t minimumCellsToProcess = usedCells;
-            for (size_t i = 0; (i < minimumCellsToProcess) & (i < HeapConstants<heapType>::cellsPerBlock); i++) {
-                Cell* cell = curBlock->cells + i;
-                if (cell->u.freeCell.zeroIfFree == 0) {
-                    ++minimumCellsToProcess;
-                } else {
-                    if (!curBlock->marked.get(i >> HeapConstants<heapType>::bitmapShift)) {
-                        if (heapType != NumberHeap) {
-                            JSCell* imp = reinterpret_cast<JSCell*>(cell);
-#if ENABLE(JSC_ZOMBIES)
-                            if (!imp->isZombie()) {
-                                const ClassInfo* info = imp->classInfo();
-                                imp->~JSCell();
-                                new (imp) JSZombie(info, JSZombie::leakedZombieStructure());
-                                heap.numZombies++;
-                            }
-#else
-                            imp->~JSCell();
-#endif
-                        }
-#if !ENABLE(JSC_ZOMBIES)
-                        --usedCells;
-                        --numLiveObjects;
-
-                        // put cell on the free list
-                        cell->u.freeCell.zeroIfFree = 0;
-                        cell->u.freeCell.next = freeList - (cell + 1);
-                        freeList = cell;
-#endif
-                    }
-                }
-            }
+        if (!cell->isZombie()) {
+            const ClassInfo* info = cell->classInfo();
+            cell->~JSCell();
+            new (cell) JSZombie(info, JSZombie::leakedZombieStructure());
+            Heap::markCell(cell);
         }
-
-        curBlock->usedCells = static_cast<uint32_t>(usedCells);
-        curBlock->freeList = freeList;
-        curBlock->marked.clearAll();
-
-        if (!usedCells)
-            ++emptyBlocks;
-    }
-
-    if (heap.numLiveObjects != numLiveObjects)
-        heap.firstBlockWithPossibleSpace = 0;
-
-    heap.numLiveObjects = numLiveObjects;
-    heap.numLiveObjectsAtLastCollect = numLiveObjects;
-    heap.extraCost = 0;
-
-    if (!emptyBlocks)
-        return numLiveObjects;
-
-    size_t neededCells = 1.25f * (numLiveObjects + max(ALLOCATIONS_PER_COLLECTION, numLiveObjects));
-    size_t neededBlocks = (neededCells + HeapConstants<heapType>::cellsPerBlock - 1) / HeapConstants<heapType>::cellsPerBlock;
-    for (size_t block = 0; block < heap.usedBlocks; block++) {
-        if (heap.usedBlocks <= neededBlocks)
-            break;
-
-        Block* curBlock = reinterpret_cast<Block*>(heap.blocks[block]);
-        if (curBlock->usedCells)
-            continue;
-
-        freeBlock<heapType>(block);
-        block--; // Don't move forward a step in this case
-    }
-
-    return numLiveObjects;
-}
-
-bool Heap::collect()
+#else
+        cell->~JSCell();
+        // Callers of sweep assume it's safe to mark any cell in the heap.
+        new (cell) JSCell(dummyMarkableCellStructure);
     1130#endif
     1131    }
     1132
     1133    heap.operationInProgress = NoOperation;
     1134}
     1135
     1136void Heap::markRoots()
    11351137{
    11361138#ifndef NDEBUG
     
    11411143#endif
    11421144
    1143     ASSERT((primaryHeap.operationInProgress == NoOperation) | (numberHeap.operationInProgress == NoOperation));
    1144     if ((primaryHeap.operationInProgress != NoOperation) | (numberHeap.operationInProgress != NoOperation))
     1145    ASSERT((primaryHeap.operationInProgress == NoOperation) & (numberHeap.operationInProgress == NoOperation));
     1146    if (!((primaryHeap.operationInProgress == NoOperation) & (numberHeap.operationInProgress == NoOperation)))
    11451147        CRASH();
    11461148
    1147     JAVASCRIPTCORE_GC_BEGIN();
    11481149    primaryHeap.operationInProgress = Collection;
    11491150    numberHeap.operationInProgress = Collection;
    11501151
    1151     // MARK: first mark all referenced objects recursively starting out from the set of root objects
    11521152    MarkStack& markStack = m_globalData->markStack;
     1153
     1154    // Reset mark bits.
     1155    clearMarkBits<PrimaryHeap>();
     1156    clearMarkBits<NumberHeap>();
     1157
     1158    // Mark stack roots.
    11531159    markStackObjectsConservatively(markStack);
     1160    m_globalData->interpreter->registerFile().markCallFrames(markStack, this);
     1161
     1162    // Mark explicitly registered roots.
    11541163    markProtectedObjects(markStack);
     1164
     1165    // Mark misc. other roots.
    11551166    if (m_markListSet && m_markListSet->size())
    11561167        MarkedArgumentBuffer::markLists(markStack, *m_markListSet);
    11571168    if (m_globalData->exception)
    11581169        markStack.append(m_globalData->exception);
    1159     m_globalData->interpreter->registerFile().markCallFrames(markStack, this);
    11601170    m_globalData->smallStrings.markChildren(markStack);
    11611171    if (m_globalData->functionCodeBlockBeingReparsed)
     
    11661176    markStack.drain();
    11671177    markStack.compact();
    1168     JAVASCRIPTCORE_GC_MARKED();
    1169 
    1170     size_t originalLiveObjects = primaryHeap.numLiveObjects + numberHeap.numLiveObjects;
    1171     size_t numLiveObjects = sweep<PrimaryHeap>();
    1172     numLiveObjects += sweep<NumberHeap>();
    11731178
    11741179    primaryHeap.operationInProgress = NoOperation;
    11751180    numberHeap.operationInProgress = NoOperation;
    1176     JAVASCRIPTCORE_GC_END(originalLiveObjects, numLiveObjects);
    1177 
    1178     return numLiveObjects < originalLiveObjects;
    1179 }
    1180 
    1181 size_t Heap::objectCount()
    1182 {
    1183     return primaryHeap.numLiveObjects + numberHeap.numLiveObjects - m_globalData->smallStrings.count();
     1181}
     1182
     1183size_t Heap::objectCount() const
     1184{
     1185    return objectCount<PrimaryHeap>() + objectCount<NumberHeap>();
    11841186}
    11851187
    11861188template <HeapType heapType>
    1187 static void addToStatistics(Heap::Statistics& statistics, const CollectorHeap& heap)
    1188 {
    1189     typedef HeapConstants<heapType> HC;
    1190     for (size_t i = 0; i < heap.usedBlocks; ++i) {
    1191         if (heap.blocks[i]) {
    1192             statistics.size += BLOCK_SIZE;
    1193             statistics.free += (HC::cellsPerBlock - heap.blocks[i]->usedCells) * HC::cellSize;
    1194         }
    1195     }
     1189size_t Heap::objectCount() const
     1190{
     1191    const CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
     1192
     1193    return heap.nextBlock * HeapConstants<heapType>::cellsPerBlock // allocated full blocks
     1194           + heap.nextCell // allocated cells in current block
     1195           + markedCells<heapType>(heap.nextBlock, heap.nextCell) // marked cells in remainder of heap
     1196           - heap.usedBlocks; // 1 cell per block is a dummy sentinel
     1197}
     1198
     1199template <HeapType heapType>
     1200void Heap::addToStatistics(Heap::Statistics& statistics) const
     1201{
     1202    const CollectorHeap& heap = heapType == PrimaryHeap ? primaryHeap : numberHeap;
     1203
     1204    statistics.size += heap.usedBlocks * BLOCK_SIZE;
     1205    statistics.free += heap.usedBlocks * BLOCK_SIZE - (objectCount<heapType>() * HeapConstants<heapType>::cellSize);
    11961206}
    11971207
     
    11991209{
    12001210    Statistics statistics = { 0, 0 };
    1201     JSC::addToStatistics<PrimaryHeap>(statistics, primaryHeap);
    1202     JSC::addToStatistics<NumberHeap>(statistics, numberHeap);
     1211    addToStatistics<PrimaryHeap>(statistics);
     1212    addToStatistics<NumberHeap>(statistics);
    12031213    return statistics;
    12041214}
     
    12721282}
    12731283
    1274 Heap::iterator Heap::primaryHeapBegin()
    1275 {
    1276     return iterator(primaryHeap.blocks, primaryHeap.blocks + primaryHeap.usedBlocks);
    1277 }
    1278 
    1279 Heap::iterator Heap::primaryHeapEnd()
    1280 {
    1281     return iterator(primaryHeap.blocks + primaryHeap.usedBlocks, primaryHeap.blocks + primaryHeap.usedBlocks);
     1284void Heap::reset()
     1285{
     1286    JAVASCRIPTCORE_GC_BEGIN();
     1287
     1288    markRoots();
     1289
     1290    JAVASCRIPTCORE_GC_MARKED();
     1291
     1292    primaryHeap.nextCell = 0;
     1293    primaryHeap.nextBlock = 0;
     1294    primaryHeap.extraCost = 0;
     1295#if ENABLE(JSC_ZOMBIES)
     1296    sweep<PrimaryHeap>();
     1297#endif
     1298    resizeBlocks<PrimaryHeap>();
     1299
     1300#if USE(JSVALUE32)
     1301    numberHeap.nextCell = 0;
     1302    numberHeap.nextBlock = 0;
     1303    resizeBlocks<NumberHeap>();
     1304#endif
     1305
     1306    JAVASCRIPTCORE_GC_END();
     1307}
     1308
     1309void Heap::collectAllGarbage()
     1310{
     1311    JAVASCRIPTCORE_GC_BEGIN();
     1312
     1313    // If the last iteration through the heap deallocated blocks, we need
     1314    // to clean up remaining garbage before marking. Otherwise, the conservative
     1315    // marking mechanism might follow a pointer to unmapped memory.
     1316    if (primaryHeap.didShrink)
     1317        sweep<PrimaryHeap>();
     1318
     1319    markRoots();
     1320
     1321    JAVASCRIPTCORE_GC_MARKED();
     1322
     1323    primaryHeap.nextCell = 0;
     1324    primaryHeap.nextBlock = 0;
     1325    primaryHeap.extraCost = 0;
     1326    sweep<PrimaryHeap>();
     1327    resizeBlocks<PrimaryHeap>();
     1328
     1329#if USE(JSVALUE32)
     1330    numberHeap.nextCell = 0;
     1331    numberHeap.nextBlock = 0;
     1332    resizeBlocks<NumberHeap>();
     1333#endif
     1334
     1335    JAVASCRIPTCORE_GC_END();
     1336}
     1337
     1338LiveObjectIterator<PrimaryHeap> Heap::primaryHeapBegin()
     1339{
     1340    return LiveObjectIterator<PrimaryHeap>(primaryHeap, 0);
     1341}
     1342
     1343LiveObjectIterator<PrimaryHeap> Heap::primaryHeapEnd()
     1344{
     1345    return LiveObjectIterator<PrimaryHeap>(primaryHeap, primaryHeap.usedBlocks);
    12821346}
    12831347
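The reworked `Heap::objectCount()` in this file no longer maintains a live-object counter; it derives the total from the allocation cursor. A minimal sketch of that arithmetic (the function name and parameters below are illustrative stand-ins, not WebKit API):

```cpp
#include <cassert>
#include <cstddef>

// Sketch of the mark-allocate accounting used by Heap::objectCount():
// cells before the allocation cursor (nextBlock, nextCell) are live by
// construction, cells past it are live only if their mark bit is set, and
// each block carries one dummy sentinel cell that must be subtracted.
size_t liveObjectCount(size_t cellsPerBlock, size_t nextBlock, size_t nextCell,
                       size_t markedCellsPastCursor, size_t usedBlocks)
{
    return nextBlock * cellsPerBlock   // allocated full blocks
         + nextCell                    // allocated cells in the current block
         + markedCellsPastCursor       // marked cells in the remainder of the heap
         - usedBlocks;                 // one sentinel cell per block
}
```

Because allocation is strictly sequential, everything below the cursor is known-live without consulting mark bits at all.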
  • trunk/JavaScriptCore/runtime/Collector.h

    r52047 r52082  
    2929#include <wtf/Noncopyable.h>
    3030#include <wtf/OwnPtr.h>
     31#include <wtf/StdLibExtras.h>
    3132#include <wtf/Threading.h>
    3233
    33 // This is supremely lame that we require pthreads to build on windows.
    3434#if ENABLE(JSC_MULTIPLE_THREADS)
    3535#include <pthread.h>
     
    5050    enum HeapType { PrimaryHeap, NumberHeap };
    5151
    52     template <HeapType> class CollectorHeapIterator;
     52    template <HeapType> class LiveObjectIterator;
    5353
    5454    struct CollectorHeap {
     55        size_t nextBlock;
     56        size_t nextCell;
     57
    5558        CollectorBlock** blocks;
    5659        size_t numBlocks;
    5760        size_t usedBlocks;
    58         size_t firstBlockWithPossibleSpace;
    59 
    60         size_t numLiveObjects;
    61         size_t numLiveObjectsAtLastCollect;
     61
    6262        size_t extraCost;
    63 #if ENABLE(JSC_ZOMBIES)
    64         size_t numZombies;
    65 #endif
     63       
     64        bool didShrink;
    6665
    6766        OperationInProgress operationInProgress;
     
    7170    public:
    7271        class Thread;
    73         typedef CollectorHeapIterator<PrimaryHeap> iterator;
    7472
    7573        void destroy();
     
    7876        void* allocate(size_t);
    7977
    80         bool collect();
    8178        bool isBusy(); // true if an allocation or collection is in progress
    82 
    83         static const size_t minExtraCostSize = 256;
     79        void collectAllGarbage();
     80
     81        static const size_t minExtraCost = 256;
     82        static const size_t maxExtraCost = 1024 * 1024;
    8483
    8584        void reportExtraMemoryCost(size_t cost);
    8685
    87         size_t objectCount();
     86        size_t objectCount() const;
    8887        struct Statistics {
    8988            size_t size;
     
    115114        static bool isNumber(JSCell*);
    116115       
    117         // Iterators for the object heap.
    118         iterator primaryHeapBegin();
    119         iterator primaryHeapEnd();
     116        LiveObjectIterator<PrimaryHeap> primaryHeapBegin();
     117        LiveObjectIterator<PrimaryHeap> primaryHeapEnd();
    120118
    121119    private:
    122120        template <HeapType heapType> void* heapAllocate(size_t);
    123         template <HeapType heapType> size_t sweep();
     121        void reset();
     122        void collectRemainingGarbage();
     123        template <HeapType heapType> void sweep();
    124124        static CollectorBlock* cellBlock(const JSCell*);
    125125        static size_t cellOffset(const JSCell*);
     
    132132        template <HeapType heapType> NEVER_INLINE void freeBlock(size_t);
    133133        NEVER_INLINE void freeBlock(CollectorBlock*);
    134         void freeBlocks(CollectorHeap*);
     134        template <HeapType heapType> void freeBlocks();
     135        template <HeapType heapType> void resizeBlocks();
     136        template <HeapType heapType> void growBlocks(size_t neededBlocks);
     137        template <HeapType heapType> void shrinkBlocks(size_t neededBlocks);
     138        template <HeapType heapType> void clearMarkBits();
     139        template <HeapType heapType> void clearMarkBits(CollectorBlock*);
     140        template <HeapType heapType> size_t markedCells(size_t startBlock = 0, size_t startCell = 0) const;
    135141
    136142        void recordExtraCost(size_t);
     143
     144        template <HeapType heapType> void addToStatistics(Statistics&) const;
     145        template <HeapType heapType> size_t objectCount() const;
     146
     147        void markRoots();
    137148        void markProtectedObjects(MarkStack&);
    138149        void markCurrentThreadConservatively(MarkStack&);
     
    190201    const size_t CELL_MASK = CELL_SIZE - 1;
    191202    const size_t CELL_ALIGN_MASK = ~CELL_MASK;
    192     const size_t CELLS_PER_BLOCK = (BLOCK_SIZE * 8 - sizeof(uint32_t) * 8 - sizeof(void *) * 8 - 2 * (7 + 3 * 8)) / (CELL_SIZE * 8 + 2);
     203    const size_t CELLS_PER_BLOCK = (BLOCK_SIZE - sizeof(Heap*) - sizeof(HeapType)) * 8 * CELL_SIZE / (8 * CELL_SIZE + 1) / CELL_SIZE; // one bitmap byte can represent 8 cells.
     204   
    193205    const size_t SMALL_CELLS_PER_BLOCK = 2 * CELLS_PER_BLOCK;
    194206    const size_t BITMAP_SIZE = (CELLS_PER_BLOCK + 7) / 8;
    195207    const size_t BITMAP_WORDS = (BITMAP_SIZE + 3) / sizeof(uint32_t);
    196  
     208
    197209    struct CollectorBitmap {
    198210        uint32_t bits[BITMAP_WORDS];
     
    201213        void clear(size_t n) { bits[n >> 5] &= ~(1 << (n & 0x1F)); }
    202214        void clearAll() { memset(bits, 0, sizeof(bits)); }
     215        size_t count(size_t startCell = 0)
     216        {
     217            size_t result = 0;
     218            for ( ; (startCell & 0x1F) != 0; ++startCell) {
     219                if (get(startCell))
     220                    ++result;
     221            }
     222            for (size_t i = startCell >> 5; i < BITMAP_WORDS; ++i)
     223                result += WTF::bitCount(bits[i]);
     224            return result;
     225        }
     226        bool isEmpty() // Much more efficient than testing count() == 0.
     227        {
     228            for (size_t i = 0; i < BITMAP_WORDS; ++i)
     229                if (bits[i] != 0)
     230                    return false;
     231            return true;
     232        }
    203233    };
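The new `CollectorBitmap::count()` above mixes a bit-by-bit walk (to reach a 32-bit word boundary) with whole-word population counts via `WTF::bitCount`. A standalone sketch of the same pattern, assuming GCC/Clang's `__builtin_popcount` in place of `WTF::bitCount`, with an illustrative bitmap size rather than WebKit's derived `BITMAP_WORDS`:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>

// Minimal stand-in for CollectorBitmap: kWords is arbitrary here.
struct MarkBitmap {
    static constexpr size_t kWords = 8; // covers 256 cells
    uint32_t bits[kWords];

    void clearAll() { std::memset(bits, 0, sizeof(bits)); }
    bool get(size_t n) const { return !!(bits[n >> 5] & (1u << (n & 0x1F))); }
    void set(size_t n) { bits[n >> 5] |= (1u << (n & 0x1F)); }

    // Count set bits at or after startCell: walk bit-by-bit until startCell
    // is word-aligned, then popcount whole 32-bit words.
    size_t count(size_t startCell = 0) const
    {
        size_t result = 0;
        for (; (startCell & 0x1F) != 0 && startCell < kWords * 32; ++startCell) {
            if (get(startCell))
                ++result;
        }
        for (size_t i = startCell >> 5; i < kWords; ++i)
            result += static_cast<size_t>(__builtin_popcount(bits[i]));
        return result;
    }
};
```

The word-at-a-time tail is what makes `count()` cheap enough to call from `objectCount()` on every query instead of maintaining a running total.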
    204234 
    205235    struct CollectorCell {
    206         union {
    207             double memory[CELL_ARRAY_LENGTH];
    208             struct {
    209                 void* zeroIfFree;
    210                 ptrdiff_t next;
    211             } freeCell;
    212         } u;
     236        double memory[CELL_ARRAY_LENGTH];
    213237    };
    214238
    215239    struct SmallCollectorCell {
    216         union {
    217             double memory[CELL_ARRAY_LENGTH / 2];
    218             struct {
    219                 void* zeroIfFree;
    220                 ptrdiff_t next;
    221             } freeCell;
    222         } u;
     240        double memory[CELL_ARRAY_LENGTH / 2];
    223241    };
    224242
     
    226244    public:
    227245        CollectorCell cells[CELLS_PER_BLOCK];
    228         uint32_t usedCells;
    229         CollectorCell* freeList;
    230246        CollectorBitmap marked;
    231247        Heap* heap;
     
    236252    public:
    237253        SmallCollectorCell cells[SMALL_CELLS_PER_BLOCK];
    238         uint32_t usedCells;
    239         SmallCollectorCell* freeList;
    240254        CollectorBitmap marked;
    241255        Heap* heap;
     
    288302    inline void Heap::reportExtraMemoryCost(size_t cost)
    289303    {
    290         if (cost > minExtraCostSize)
    291             recordExtraCost(cost / (CELL_SIZE * 2));
     304        if (cost > minExtraCost)
     305            recordExtraCost(cost);
    292306    }
    293307
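The reworked `CELLS_PER_BLOCK` expression in this header budgets 8 * `CELL_SIZE` payload bits plus one mark bit per cell out of the block's non-header bytes. A sanity-check sketch of that sizing, with illustrative constants rather than WebKit's exact `BLOCK_SIZE` and header layout:

```cpp
#include <cassert>
#include <cstddef>

// Each cell costs cellSize bytes of payload plus one mark bit, so the
// block's non-header bits divide into (8 * cellSize + 1)-bit units.
size_t cellsPerBlock(size_t blockSize, size_t headerBytes, size_t cellSize)
{
    size_t payloadBits = (blockSize - headerBytes) * 8;
    size_t cells = payloadBits / (8 * cellSize + 1);
    // Confirm the cells, their bitmap, and the header all fit in the block.
    size_t bitmapBytes = (cells + 7) / 8;
    assert(cells * cellSize + bitmapBytes + headerBytes <= blockSize);
    return cells;
}
```

This is why one bitmap byte represents 8 cells, as the in-line comment on the new constant notes.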
  • trunk/JavaScriptCore/runtime/CollectorHeapIterator.h

    r52047 r52082  
    3232namespace JSC {
    3333
    34     template <HeapType heapType> class CollectorHeapIterator {
     34    class CollectorHeapIterator {
    3535    public:
    36         CollectorHeapIterator(CollectorBlock** block, CollectorBlock** endBlock);
    37 
    38         bool operator!=(const CollectorHeapIterator<heapType>& other) { return m_block != other.m_block || m_cell != other.m_cell; }
    39         CollectorHeapIterator<heapType>& operator++();
     36        bool operator!=(const CollectorHeapIterator& other);
    4037        JSCell* operator*() const;
    4138   
    42     private:
    43         typedef typename HeapConstants<heapType>::Block Block;
    44         typedef typename HeapConstants<heapType>::Cell Cell;
     39    protected:
     40        CollectorHeapIterator(CollectorHeap&, size_t startBlock, size_t startCell);
     41        void advance(size_t cellsPerBlock);
    4542
    46         Block** m_block;
    47         Block** m_endBlock;
    48         Cell* m_cell;
    49         Cell* m_endCell;
     43        CollectorHeap& m_heap;
     44        size_t m_block;
     45        size_t m_cell;
    5046    };
    5147
    52     template <HeapType heapType>
    53     CollectorHeapIterator<heapType>::CollectorHeapIterator(CollectorBlock** block, CollectorBlock** endBlock)
    54         : m_block(reinterpret_cast<Block**>(block))
    55         , m_endBlock(reinterpret_cast<Block**>(endBlock))
    56         , m_cell(m_block == m_endBlock ? 0 : (*m_block)->cells)
    57         , m_endCell(m_block == m_endBlock ? 0 : (*m_block)->cells + HeapConstants<heapType>::cellsPerBlock)
     48    template <HeapType heapType>
     49    class LiveObjectIterator : public CollectorHeapIterator {
     50    public:
     51        LiveObjectIterator(CollectorHeap&, size_t startBlock, size_t startCell = 0);
     52        LiveObjectIterator<heapType>& operator++();
     53    };
     54
     55    template <HeapType heapType>
     56    class DeadObjectIterator : public CollectorHeapIterator {
     57    public:
     58        DeadObjectIterator(CollectorHeap&, size_t startBlock, size_t startCell = 0);
     59        DeadObjectIterator<heapType>& operator++();
     60    };
     61
     62    template <HeapType heapType>
     63    class ObjectIterator : public CollectorHeapIterator {
     64    public:
     65        ObjectIterator(CollectorHeap&, size_t startBlock, size_t startCell = 0);
     66        ObjectIterator<heapType>& operator++();
     67    };
     68
     69    inline CollectorHeapIterator::CollectorHeapIterator(CollectorHeap& heap, size_t startBlock, size_t startCell)
     70        : m_heap(heap)
     71        , m_block(startBlock)
     72        , m_cell(startCell)
    5873    {
    59         if (m_cell && m_cell->u.freeCell.zeroIfFree == 0)
    60             ++*this;
    6174    }
    6275
    63     template <HeapType heapType>
    64     CollectorHeapIterator<heapType>& CollectorHeapIterator<heapType>::operator++()
     76    inline bool CollectorHeapIterator::operator!=(const CollectorHeapIterator& other)
    6577    {
     78        return m_block != other.m_block || m_cell != other.m_cell;
     79    }
     80
     81    inline JSCell* CollectorHeapIterator::operator*() const
     82    {
     83        return reinterpret_cast<JSCell*>(m_heap.blocks[m_block]->cells + m_cell);
     84    }
     85   
     86    inline void CollectorHeapIterator::advance(size_t cellsPerBlock)
     87    {
     88        ++m_cell;
     89        if (m_cell == cellsPerBlock) {
     90            m_cell = 0;
     91            ++m_block;
     92        }
     93    }
     94
     95    template <HeapType heapType>
     96    inline LiveObjectIterator<heapType>::LiveObjectIterator(CollectorHeap& heap, size_t startBlock, size_t startCell)
     97        : CollectorHeapIterator(heap, startBlock, startCell - 1)
     98    {
     99        ++(*this);
     100    }
     101
     102    template <HeapType heapType>
     103    inline LiveObjectIterator<heapType>& LiveObjectIterator<heapType>::operator++()
     104    {
     105        if (m_block < m_heap.nextBlock || m_cell < m_heap.nextCell) {
     106            advance(HeapConstants<heapType>::cellsPerBlock);
     107            return *this;
     108        }
     109
    66110        do {
    67             for (++m_cell; m_cell != m_endCell; ++m_cell)
    68                 if (m_cell->u.freeCell.zeroIfFree != 0) {
    69                     return *this;
    70                 }
    71 
    72             if (++m_block != m_endBlock) {
    73                 m_cell = (*m_block)->cells;
    74                 m_endCell = (*m_block)->cells + HeapConstants<heapType>::cellsPerBlock;
    75             }
    76         } while(m_block != m_endBlock);
    77 
    78         m_cell = 0;
     111            advance(HeapConstants<heapType>::cellsPerBlock);
     112        } while (m_block < m_heap.usedBlocks && !m_heap.blocks[m_block]->marked.get(m_cell));
    79113        return *this;
    80114    }
    81115
    82     template <HeapType heapType>
    83     JSCell* CollectorHeapIterator<heapType>::operator*() const
     116    template <HeapType heapType>
     117    inline DeadObjectIterator<heapType>::DeadObjectIterator(CollectorHeap& heap, size_t startBlock, size_t startCell)
     118        : CollectorHeapIterator(heap, startBlock, startCell - 1)
    84119    {
    85         return reinterpret_cast<JSCell*>(m_cell);
     120        ++(*this);
     121    }
     122
     123    template <HeapType heapType>
     124    inline DeadObjectIterator<heapType>& DeadObjectIterator<heapType>::operator++()
     125    {
     126        do {
     127            advance(HeapConstants<heapType>::cellsPerBlock);
     128            ASSERT(m_block > m_heap.nextBlock || (m_block == m_heap.nextBlock && m_cell >= m_heap.nextCell));
     129        } while (m_block < m_heap.usedBlocks && m_heap.blocks[m_block]->marked.get(m_cell));
     130        return *this;
     131    }
     132
     133    template <HeapType heapType>
     134    inline ObjectIterator<heapType>::ObjectIterator(CollectorHeap& heap, size_t startBlock, size_t startCell)
     135        : CollectorHeapIterator(heap, startBlock, startCell - 1)
     136    {
     137        ++(*this);
     138    }
     139
     140    template <HeapType heapType>
     141    inline ObjectIterator<heapType>& ObjectIterator<heapType>::operator++()
     142    {
     143        advance(HeapConstants<heapType>::cellsPerBlock);
     144        return *this;
    86145    }
    87146
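`LiveObjectIterator::operator++` above encodes the central mark-allocate invariant: every cell below the allocation cursor is live because the allocator hands cells out in order, while past the cursor only marked cells are live. A flattened sketch of that stepping rule, where `MiniHeap` and `nextLive` are hypothetical stand-ins for `CollectorHeap` and the iterator:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical stand-in for CollectorHeap: one mark bit per cell plus the
// allocation cursor (nextBlock, nextCell).
struct MiniHeap {
    size_t cellsPerBlock;
    size_t nextBlock;
    size_t nextCell;
    std::vector<std::vector<bool> > marked; // marked[block][cell]
};

// Advance (block, cell) to the next live cell. Mirrors operator++:
// cells before the cursor are live by construction; beyond it, skip
// forward until a marked cell (or the end of the heap) is reached.
bool nextLive(const MiniHeap& h, size_t& block, size_t& cell)
{
    do {
        if (++cell == h.cellsPerBlock) {
            cell = 0;
            ++block;
        }
        if (block >= h.marked.size())
            return false; // reached end()
    } while (!(block < h.nextBlock
               || (block == h.nextBlock && cell < h.nextCell)
               || h.marked[block][cell]));
    return true;
}
```

To start, initialize `cell` to `static_cast<size_t>(-1)` so the first increment lands on cell 0; this is the same trick as the iterator constructors above, which pass `startCell - 1` and then apply `++(*this)`.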
  • trunk/JavaScriptCore/runtime/JSArray.cpp

    r52047 r52082  
    381381    unsigned vectorLength = m_vectorLength;
    382382
    383     Heap::heap(this)->reportExtraMemoryCost(storageSize(newVectorLength) - storageSize(vectorLength));
    384 
    385383    if (newNumValuesInVector == storage->m_numValuesInVector + 1) {
    386384        for (unsigned j = vectorLength; j < newVectorLength; ++j)
     
    403401
    404402    checkConsistency();
     403
     404    Heap::heap(this)->reportExtraMemoryCost(storageSize(newVectorLength) - storageSize(vectorLength));
    405405}
    406406
     
    493493        return false;
    494494
    495     Heap::heap(this)->reportExtraMemoryCost(storageSize(newVectorLength) - storageSize(vectorLength));
    496495    m_vectorLength = newVectorLength;
    497496
     
    500499
    501500    m_storage = storage;
     501
     502    Heap::heap(this)->reportExtraMemoryCost(storageSize(newVectorLength) - storageSize(vectorLength));
     503
    502504    return true;
    503505}
  • trunk/JavaScriptCore/runtime/JSCell.h

    r52047 r52082  
    4848    private:
    4949        explicit JSCell(Structure*);
    50         JSCell(); // Only used for initializing Collector blocks.
    5150        virtual ~JSCell();
    5251
    5352    public:
     53        static PassRefPtr<Structure> createDummyStructure()
     54        {
     55            return Structure::create(jsNull(), TypeInfo(UnspecifiedType));
     56        }
     57
    5458        // Querying the type.
    5559#if USE(JSVALUE32)
     
    123127    }
    124128
    125     // Only used for initializing Collector blocks.
    126     inline JSCell::JSCell()
    127     {
    128     }
    129 
    130129    inline JSCell::~JSCell()
    131130    {
  • trunk/JavaScriptCore/runtime/JSGlobalData.cpp

    r52047 r52082  
    127127    , getterSetterStructure(GetterSetter::createStructure(jsNull()))
    128128    , apiWrapperStructure(JSAPIValueWrapper::createStructure(jsNull()))
     129    , dummyMarkableCellStructure(JSCell::createDummyStructure())
    129130#if USE(JSVALUE32)
    130131    , numberStructure(JSNumberCell::createStructure(jsNull()))
  • trunk/JavaScriptCore/runtime/JSGlobalData.h

    r52047 r52082  
    124124        RefPtr<Structure> getterSetterStructure;
    125125        RefPtr<Structure> apiWrapperStructure;
     126        RefPtr<Structure> dummyMarkableCellStructure;
    126127
    127128#if USE(JSVALUE32)
  • trunk/JavaScriptCore/runtime/JSString.h

    r52075 r52082  
    423423                return globalData->smallStrings.singleCharacterString(globalData, c);
    424424        }
    425         return new (globalData) JSString(globalData, UString(UString::Rep::create(s.rep(), offset, length)));
     425        return new (globalData) JSString(globalData, UString(UString::Rep::create(s.rep(), offset, length)), JSString::HasOtherOwner);
    426426    }
    427427
  • trunk/JavaScriptCore/runtime/Tracing.d

    r52047 r52082  
    2828    probe gc__begin();
    2929    probe gc__marked();
    30     probe gc__end(int, int);
     30    probe gc__end();
    3131   
    3232    probe profile__will_execute(int, char*, char*, int);
  • trunk/JavaScriptCore/runtime/Tracing.h

    r52047 r52082  
    3434#define JAVASCRIPTCORE_GC_BEGIN_ENABLED() 0
    3535
    36 #define JAVASCRIPTCORE_GC_END(arg0, arg1)
     36#define JAVASCRIPTCORE_GC_END()
    3737#define JAVASCRIPTCORE_GC_END_ENABLED() 0
    3838
  • trunk/JavaScriptCore/runtime/UString.h

    r52075 r52082  
    513513    // FIXME: this should be size_t but that would cause warnings until we
    514514    // fix UString sizes to be size_t instead of int
    515     static const int minShareSize = Heap::minExtraCostSize / sizeof(UChar);
     515    static const int minShareSize = Heap::minExtraCost / sizeof(UChar);
    516516
    517517    inline size_t UString::cost() const