Timestamp: Sep 25, 2021, 2:55:41 PM
Author: [email protected]
Message:

Build an unlinked baseline JIT
https://bugs.webkit.org/show_bug.cgi?id=229223
<rdar://problem/82321772>

Reviewed by Yusuke Suzuki.

Source/JavaScriptCore:

This patch adds an "unlinked" baseline JIT on JSVALUE64 platforms. The JIT
code produced by this baseline JIT can be shared between all CodeBlocks that
share an UnlinkedCodeBlock. The benefit is that if we create a CodeBlock from
an UnlinkedCodeBlock that has already compiled an unlinked baseline JIT
instance, the new CodeBlock starts off executing in the baseline JIT
"for free".

To make this work, the code we emit now needs to be independent of a specific
CodeBlock instance. We still use a CodeBlock instance for minimal profiling
information when compiling, but otherwise the code is tied only to the
UnlinkedCodeBlock. When we need CodeBlock-specific information, we load it
dynamically, typically from the Metadata table. This patch also adds a
"linked constant pool" concept: whenever we instantiate such a CodeBlock, we
also instantiate its "linked constant pool", which holds things like our
inline cache data structures (StructureStubInfo*), the JSGlobalObject*, etc.
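
As a concrete illustration, the baseline prologue now rematerializes the
metadata and constant pool base registers from the CodeBlock slot in the call
frame instead of baking a CodeBlock* into the emitted code. This sketch is
adapted from the JIT.cpp hunk in the diff at the end of this change (same
helper and register names as there):

    void JIT::emitMaterializeMetadataAndConstantPoolRegisters()
    {
        // The CodeBlock* lives in the frame header, not in the emitted code.
        loadPtr(addressFor(CallFrameSlot::codeBlock), regT0);
        // Cache the per-CodeBlock metadata table...
        loadPtr(Address(regT0, CodeBlock::offsetOfMetadataTable()), s_metadataGPR);
        // ...and the per-CodeBlock linked constant pool (StructureStubInfo*, JSGlobalObject*, ...).
        loadPtr(Address(regT0, CodeBlock::offsetOfJITData()), regT0);
        loadPtr(Address(regT0, CodeBlock::JITData::offsetOfJITConstantPool()), s_constantsGPR);
    }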

The unlinked baseline JIT always runs ICs in "data" mode. To make this work, I
made data ICs work on x86_64. To do this, we no longer call/ret to the IC.
Instead, we jump to the IC, and the IC jumps back by dynamically loading the
"done" location from the StructureStubInfo. This simplifies the design so it no
longer depends on the arm64 calling convention, and keeps the same performance
characteristics.
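
Concretely, the stub's exit path no longer ends in a ret. A rough sketch of the
idea (only StructureStubInfo::offsetOfDoneLocation() is taken from this patch;
the register name, the farJump call, and the pointer tag are illustrative
simplifications):

    // Hedged sketch: hand control back to the main-line code by loading the
    // "done" location out of the StructureStubInfo and jumping to it, instead
    // of relying on a call/ret pairing.
    // stubInfoGPR is assumed to already hold the StructureStubInfo* for this access.
    jit.farJump(
        CCallHelpers::Address(stubInfoGPR, StructureStubInfo::offsetOfDoneLocation()),
        JSInternalPtrTag);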

This patch also adds a new version of InlineAccess that is only used by the
baseline JIT (for now). In the future, we can make the DFG/FTL use it for data
ICs too, but we don't need to do that yet since those tiers don't use data ICs
by default. The baseline JIT now takes a pure data IC approach to InlineAccess:
instead of repatching code, we repatch fields that the generated code loads
dynamically.
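
For a self property load, "repatching fields" means something along these lines
(a hedged sketch: the two StructureStubInfo offsets are from this patch, while
the register names and the final property-load helper are illustrative):

    // Guard: compare the base cell's structure ID against the one cached in
    // the StructureStubInfo. Repatching this IC just stores a new structure ID
    // and PropertyOffset into the stub info; the code itself never changes.
    load32(Address(baseGPR, JSCell::structureIDOffset()), scratchGPR);
    slowCases.append(branch32(NotEqual, scratchGPR,
        Address(stubInfoGPR, StructureStubInfo::offsetOfInlineAccessBaseStructure())));
    // Load the cached offset, then the property at that dynamic offset.
    load32(Address(stubInfoGPR, StructureStubInfo::offsetOfByIdSelfOffset()), scratchGPR);
    loadProperty(baseGPR, scratchGPR, resultRegs); // helper assumed; real code handles inline vs. out-of-line storage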

This patch also cleans up a few things in OSR exit, where both the DFG and FTL
were storing callee saves to the callee-saves buffer in an odd place, and
separately from one another. This code becomes simpler if we just store the
callee saves at the end of the OSR exit handler, from common JIT emission
code.

This patch also fixes a bug where we could end up with the wrong (and always
more negative) SP in the baseline JIT. This could happen when we OSR exit from
an inlined getter/setter: the OSR exit code sets the return PC for returning to
the getter/setter's call site to the inline cache's "done location", but that
"done location" did not previously restore SP. This patch conservatively
restores SP at these sites.
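
The diff below adds a JIT::resetSP() helper for this, which recomputes SP from
the call frame register using the frame's known size:

    void JIT::resetSP()
    {
        // Rematerialize SP from the frame pointer so code that reaches the
        // "done location" with a stale, overly negative SP cannot leak it into
        // the code that follows.
        addPtr(TrustedImm32(stackPointerOffsetFor(m_unlinkedCodeBlock) * sizeof(Register)), callFrameRegister, stackPointerRegister);
        checkStackPointerAlignment();
    }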

This is measured as a 1% speedup on Speedometer2.

  • CMakeLists.txt:
  • JavaScriptCore.xcodeproj/project.pbxproj:
  • Sources.txt:
  • bytecode/AccessCase.cpp:

(JSC::AccessCase::fromStructureStubInfo):
(JSC::AccessCase::generateImpl):

  • bytecode/BytecodeList.rb:
  • bytecode/BytecodeOperandsForCheckpoint.h:

(JSC::valueProfileOffsetFor):

  • bytecode/CallLinkInfo.cpp:

(JSC::CallLinkInfo::fastPathStart):
(JSC::CallLinkInfo::emitFastPathImpl):
(JSC::CallLinkInfo::emitFastPath):
(JSC::CallLinkInfo::emitTailCallFastPath):
(JSC::CallLinkInfo::emitDataICFastPath):
(JSC::CallLinkInfo::emitTailCallDataICFastPath):
(JSC::CallLinkInfo::emitDataICSlowPath):
(JSC::CallLinkInfo::initializeDataIC):
(JSC::CallLinkInfo::emitDirectFastPath):
(JSC::CallLinkInfo::emitDirectTailCallFastPath):

  • bytecode/CallLinkInfo.h:

(JSC::CallLinkInfo::offsetOfMaxArgumentCountIncludingThis):
(JSC::CallLinkInfo::slowStub): Deleted.
(JSC::CallLinkInfo::addressOfMaxArgumentCountIncludingThis): Deleted.

  • bytecode/CodeBlock.cpp:

(JSC::CodeBlock::CodeBlock):
(JSC::CodeBlock::finishCreation):
(JSC::CodeBlock::setupWithUnlinkedBaselineCode):
(JSC::CodeBlock::isConstantOwnedByUnlinkedCodeBlock const):
(JSC::CodeBlock::setConstantRegisters):
(JSC::CodeBlock::finalizeJITInlineCaches):
(JSC::CodeBlock::finalizeUnconditionally):
(JSC::CodeBlock::frameRegisterCount):
(JSC::CodeBlock::binaryArithProfileForPC):
(JSC::CodeBlock::unaryArithProfileForPC):
(JSC::CodeBlock::findPC):
(JSC::CodeBlock::jitSoon):
(JSC::CodeBlock::jitNextInvocation):
(JSC::CodeBlock::dumpMathICStats):
(JSC::CodeBlock::finalizeBaselineJITInlineCaches): Deleted.
(JSC::CodeBlock::addJITAddIC): Deleted.
(JSC::CodeBlock::addJITMulIC): Deleted.
(JSC::CodeBlock::addJITSubIC): Deleted.
(JSC::CodeBlock::addJITNegIC): Deleted.
(JSC::CodeBlock::setPCToCodeOriginMap): Deleted.
(JSC::CodeBlock::thresholdForJIT): Deleted.
(JSC::CodeBlock::jitAfterWarmUp): Deleted.

  • bytecode/CodeBlock.h:

(JSC::CodeBlock::JITData::offsetOfJITConstantPool):
(JSC::CodeBlock::offsetOfJITData):
(JSC::CodeBlock::offsetOfArgumentValueProfiles):
(JSC::CodeBlock::offsetOfConstantsVectorBuffer):
(JSC::CodeBlock::baselineJITConstantPool):
(JSC::CodeBlock::checkIfJITThresholdReached):
(JSC::CodeBlock::dontJITAnytimeSoon):
(JSC::CodeBlock::llintExecuteCounter const):
(JSC::CodeBlock::offsetOfDebuggerRequests):
(JSC::CodeBlock::offsetOfShouldAlwaysBeInlined):
(JSC::CodeBlock::loopHintsAreEligibleForFuzzingEarlyReturn):
(JSC::CodeBlock::addressOfNumParameters): Deleted.
(JSC::CodeBlock::isKnownCell): Deleted.
(JSC::CodeBlock::addMathIC): Deleted.
(JSC::CodeBlock::setJITCodeMap): Deleted.
(JSC::CodeBlock::jitCodeMap): Deleted.
(JSC::CodeBlock::switchJumpTable): Deleted.
(JSC::CodeBlock::stringSwitchJumpTable): Deleted.

  • bytecode/CodeBlockInlines.h:

(JSC::CodeBlock::forEachValueProfile):
(JSC::CodeBlock::jitCodeMap):
(JSC::CodeBlock::baselineSwitchJumpTable):
(JSC::CodeBlock::baselineStringSwitchJumpTable):
(JSC::CodeBlock::dfgSwitchJumpTable):
(JSC::CodeBlock::dfgStringSwitchJumpTable):

  • bytecode/ExecutableToCodeBlockEdge.h:
  • bytecode/ExecutionCounter.cpp:

(JSC::ExecutionCounter<countingVariant>::setThreshold):

  • bytecode/ExecutionCounter.h:

(JSC::ExecutionCounter::clippedThreshold):

  • bytecode/GetByIdMetadata.h:

(JSC::GetByIdModeMetadataArrayLength::offsetOfArrayProfile):
(JSC::GetByIdModeMetadata::offsetOfMode):

  • bytecode/GetByStatus.cpp:

(JSC::GetByStatus::computeForStubInfoWithoutExitSiteFeedback):

  • bytecode/GetterSetterAccessCase.cpp:

(JSC::GetterSetterAccessCase::emitDOMJITGetter):

  • bytecode/InByStatus.cpp:

(JSC::InByStatus::computeForStubInfoWithoutExitSiteFeedback):

  • bytecode/InlineAccess.cpp:

(JSC::InlineAccess::generateSelfPropertyAccess):
(JSC::InlineAccess::canGenerateSelfPropertyReplace):
(JSC::InlineAccess::generateSelfPropertyReplace):
(JSC::InlineAccess::isCacheableArrayLength):
(JSC::InlineAccess::generateArrayLength):
(JSC::InlineAccess::isCacheableStringLength):
(JSC::InlineAccess::generateStringLength):
(JSC::InlineAccess::generateSelfInAccess):
(JSC::InlineAccess::rewireStubAsJumpInAccess):
(JSC::InlineAccess::resetStubAsJumpInAccess):

  • bytecode/InlineAccess.h:
  • bytecode/IterationModeMetadata.h:

(JSC::IterationModeMetadata::offsetOfSeenModes):

  • bytecode/LLIntCallLinkInfo.h:

(JSC::LLIntCallLinkInfo::offsetOfArrayProfile):

  • bytecode/Opcode.h:
  • bytecode/PolymorphicAccess.cpp:

(JSC::AccessGenerationState::succeed):
(JSC::AccessGenerationState::calculateLiveRegistersForCallAndExceptionHandling):
(JSC::AccessGenerationState::preserveLiveRegistersToStackForCallWithoutExceptions):
(JSC::PolymorphicAccess::regenerate):

  • bytecode/PolymorphicAccess.h:

(JSC::AccessGenerationState::preserveLiveRegistersToStackForCallWithoutExceptions): Deleted.

  • bytecode/PutByStatus.cpp:

(JSC::PutByStatus::computeForStubInfo):

  • bytecode/StructureStubInfo.cpp:

(JSC::StructureStubInfo::initGetByIdSelf):
(JSC::StructureStubInfo::initPutByIdReplace):
(JSC::StructureStubInfo::initInByIdSelf):
(JSC::StructureStubInfo::addAccessCase):
(JSC::StructureStubInfo::reset):
(JSC::StructureStubInfo::visitWeakReferences):
(JSC::StructureStubInfo::propagateTransitions):
(JSC::StructureStubInfo::initializeFromUnlinkedStructureStubInfo):

  • bytecode/StructureStubInfo.h:

(JSC::StructureStubInfo::offsetOfByIdSelfOffset):
(JSC::StructureStubInfo::offsetOfInlineAccessBaseStructure):
(JSC::StructureStubInfo::inlineAccessBaseStructure):
(JSC::StructureStubInfo::offsetOfDoneLocation):

  • bytecode/SuperSampler.cpp:

(JSC::printSuperSamplerState):

  • bytecode/UnlinkedCodeBlock.cpp:

(JSC::UnlinkedCodeBlock::UnlinkedCodeBlock):
(JSC::UnlinkedCodeBlock::hasIdentifier):
(JSC::UnlinkedCodeBlock::thresholdForJIT):
(JSC::UnlinkedCodeBlock::allocateSharedProfiles):

  • bytecode/UnlinkedCodeBlock.h:

(JSC::UnlinkedCodeBlock::constantRegister):
(JSC::UnlinkedCodeBlock::instructionAt const):
(JSC::UnlinkedCodeBlock::bytecodeOffset):
(JSC::UnlinkedCodeBlock::instructionsSize const):
(JSC::UnlinkedCodeBlock::loopHintsAreEligibleForFuzzingEarlyReturn):
(JSC::UnlinkedCodeBlock::outOfLineJumpOffset):
(JSC::UnlinkedCodeBlock::binaryArithProfile):
(JSC::UnlinkedCodeBlock::unaryArithProfile):
(JSC::UnlinkedCodeBlock::llintExecuteCounter):

  • bytecode/UnlinkedMetadataTable.h:

(JSC::UnlinkedMetadataTable::offsetInMetadataTable):

  • bytecode/ValueProfile.h:

(JSC::ValueProfileBase::ValueProfileBase):
(JSC::ValueProfileBase::clearBuckets):
(JSC::ValueProfile::offsetOfFirstBucket):

  • dfg/DFGCommonData.h:
  • dfg/DFGJITCode.cpp:
  • dfg/DFGJITCode.h:
  • dfg/DFGJITCompiler.cpp:

(JSC::DFG::JITCompiler::link):

  • dfg/DFGOSREntry.cpp:

(JSC::DFG::prepareOSREntry):

  • dfg/DFGOSRExit.cpp:

(JSC::DFG::OSRExit::compileExit):

  • dfg/DFGOSRExitCompilerCommon.cpp:

(JSC::DFG::handleExitCounts):
(JSC::DFG::callerReturnPC):
(JSC::DFG::reifyInlinedCallFrames):
(JSC::DFG::adjustAndJumpToTarget):

  • dfg/DFGOperations.cpp:

(JSC::DFG::JSC_DEFINE_JIT_OPERATION):

  • dfg/DFGSpeculativeJIT.cpp:

(JSC::DFG::SpeculativeJIT::compilePutPrivateName):
(JSC::DFG::SpeculativeJIT::compileValueAdd):
(JSC::DFG::SpeculativeJIT::compileValueSub):
(JSC::DFG::SpeculativeJIT::compileValueNegate):
(JSC::DFG::SpeculativeJIT::compileValueMul):
(JSC::DFG::SpeculativeJIT::compileLogShadowChickenTail):

  • dfg/DFGSpeculativeJIT32_64.cpp:

(JSC::DFG::SpeculativeJIT::emitCall):

  • dfg/DFGSpeculativeJIT64.cpp:

(JSC::DFG::SpeculativeJIT::emitCall):

  • ftl/FTLCompile.cpp:

(JSC::FTL::compile):

  • ftl/FTLJITCode.h:
  • ftl/FTLLink.cpp:

(JSC::FTL::link):

  • ftl/FTLLowerDFGToB3.cpp:

(JSC::FTL::DFG::LowerDFGToB3::addMathIC):
(JSC::FTL::DFG::LowerDFGToB3::compileUnaryMathIC):
(JSC::FTL::DFG::LowerDFGToB3::compileBinaryMathIC):
(JSC::FTL::DFG::LowerDFGToB3::compilePutPrivateName):
(JSC::FTL::DFG::LowerDFGToB3::compileCompareStrictEq):

  • ftl/FTLOSRExitCompiler.cpp:

(JSC::FTL::compileStub):

  • generator/Metadata.rb:
  • jit/AssemblyHelpers.cpp:

(JSC::AssemblyHelpers::storeProperty):
(JSC::AssemblyHelpers::emitVirtualCall):
(JSC::AssemblyHelpers::emitVirtualCallWithoutMovingGlobalObject):

  • jit/AssemblyHelpers.h:

(JSC::AssemblyHelpers::copyCalleeSavesToEntryFrameCalleeSavesBuffer):

  • jit/BaselineJITCode.cpp: Added.

(JSC::MathICHolder::addJITAddIC):
(JSC::MathICHolder::addJITMulIC):
(JSC::MathICHolder::addJITSubIC):
(JSC::MathICHolder::addJITNegIC):
(JSC::MathICHolder::adoptMathICs):
(JSC::BaselineJITCode::BaselineJITCode):
(JSC::BaselineJITCode::~BaselineJITCode):

  • jit/BaselineJITCode.h: Added.

(JSC::JITConstantPool::add):
(JSC::JITConstantPool::size const):
(JSC::JITConstantPool::at const):

  • jit/BaselineJITPlan.cpp:

(JSC::BaselineJITPlan::finalize):

  • jit/CCallHelpers.cpp:

(JSC::CCallHelpers::logShadowChickenTailPacketImpl):
(JSC::CCallHelpers::logShadowChickenTailPacket):

  • jit/CCallHelpers.h:
  • jit/CallFrameShuffleData.cpp:

(JSC::CallFrameShuffleData::setupCalleeSaveRegisters):

  • jit/CallFrameShuffleData.h:
  • jit/CallFrameShuffler.cpp:

(JSC::CallFrameShuffler::CallFrameShuffler):
(JSC::CallFrameShuffler::prepareForTailCall):

  • jit/CallFrameShuffler.h:

(JSC::CallFrameShuffler::snapshot const):

  • jit/JIT.cpp:

(JSC::JIT::JIT):
(JSC::JIT::emitEnterOptimizationCheck):
(JSC::JIT::emitNotifyWriteWatchpoint):
(JSC::JIT::emitVarReadOnlyCheck):
(JSC::JIT::assertStackPointerOffset):
(JSC::JIT::resetSP):
(JSC::JIT::emitPutCodeBlockToFrameInPrologue):
(JSC::JIT::privateCompileMainPass):
(JSC::JIT::privateCompileSlowCases):
(JSC::JIT::emitMaterializeMetadataAndConstantPoolRegisters):
(JSC::JIT::emitRestoreCalleeSaves):
(JSC::JIT::compileAndLinkWithoutFinalizing):
(JSC::JIT::link):
(JSC::JIT::finalizeOnMainThread):
(JSC::JIT::privateCompile):
(JSC::JIT::frameRegisterCountFor):
(JSC::JIT::stackPointerOffsetFor):

  • jit/JIT.h:
  • jit/JITArithmetic.cpp:

(JSC::JIT::emit_compareAndJumpSlowImpl):
(JSC::JIT::emit_compareAndJumpSlow):
(JSC::JIT::emit_op_negate):
(JSC::JIT::emit_op_add):
(JSC::JIT::emitMathICFast):
(JSC::JIT::emitMathICSlow):
(JSC::JIT::emit_op_div):
(JSC::JIT::emit_op_mul):
(JSC::JIT::emit_op_sub):

  • jit/JITCall.cpp:

(JSC::JIT::emitPutCallResult):
(JSC::JIT::compileSetupFrame):
(JSC::JIT::compileCallEval):
(JSC::JIT::compileCallEvalSlowCase):
(JSC::JIT::compileTailCall):
(JSC::JIT::compileOpCall):
(JSC::JIT::compileOpCallSlowCase):
(JSC::JIT::emit_op_iterator_open):
(JSC::JIT::emitSlow_op_iterator_open):
(JSC::JIT::emit_op_iterator_next):
(JSC::JIT::emitSlow_op_iterator_next):

  • jit/JITCall32_64.cpp:

(JSC::JIT::emitPutCallResult):
(JSC::JIT::compileSetupFrame):
(JSC::JIT::compileCallEval):
(JSC::JIT::compileCallEvalSlowCase):
(JSC::JIT::compileOpCall):
(JSC::JIT::compileOpCallSlowCase):
(JSC::JIT::emit_op_iterator_open):
(JSC::JIT::emitSlow_op_iterator_open):
(JSC::JIT::emit_op_iterator_next):
(JSC::JIT::emitSlow_op_iterator_next):

  • jit/JITCode.h:

(JSC::JITCode::useDataIC):
(JSC::JITCode::pcToCodeOriginMap):

  • jit/JITCompilationKey.cpp:

(JSC::JITCompilationKey::dump const):

  • jit/JITCompilationKey.h:

(JSC::JITCompilationKey::JITCompilationKey):
(JSC::JITCompilationKey::operator! const):
(JSC::JITCompilationKey::isHashTableDeletedValue const):
(JSC::JITCompilationKey::operator== const):
(JSC::JITCompilationKey::hash const):
(JSC::JITCompilationKey::profiledBlock const): Deleted.

  • jit/JITInlineCacheGenerator.cpp:

(JSC::JITInlineCacheGenerator::JITInlineCacheGenerator):
(JSC::JITInlineCacheGenerator::finalize):
(JSC::JITInlineCacheGenerator::generateBaselineDataICFastPath):
(JSC::JITGetByIdGenerator::JITGetByIdGenerator):
(JSC::generateGetByIdInlineAccess):
(JSC::JITGetByIdGenerator::generateBaselineDataICFastPath):
(JSC::JITGetByIdWithThisGenerator::generateBaselineDataICFastPath):
(JSC::JITPutByIdGenerator::generateBaselineDataICFastPath):
(JSC::JITDelByValGenerator::generateFastPath):
(JSC::JITDelByIdGenerator::generateFastPath):
(JSC::JITInByValGenerator::generateFastPath):
(JSC::JITInByIdGenerator::generateBaselineDataICFastPath):
(JSC::JITInstanceOfGenerator::generateFastPath):
(JSC::JITGetByValGenerator::generateFastPath):
(JSC::JITPutByValGenerator::generateFastPath):
(JSC::JITPrivateBrandAccessGenerator::generateFastPath):

  • jit/JITInlineCacheGenerator.h:
  • jit/JITInlines.h:

(JSC::JIT::isOperandConstantDouble):
(JSC::JIT::isOperandConstantInt):
(JSC::JIT::isKnownCell):
(JSC::JIT::getConstantOperand):
(JSC::JIT::appendCallWithExceptionCheckSetJSValueResultWithProfile):
(JSC::JIT::linkSlowCaseIfNotJSCell):
(JSC::JIT::advanceToNextCheckpoint):
(JSC::JIT::emitJumpSlowToHotForCheckpoint):
(JSC::JIT::isOperandConstantChar):
(JSC::JIT::emitValueProfilingSite):
(JSC::JIT::emitValueProfilingSiteIfProfiledOpcode):
(JSC::JIT::emitArrayProfilingSiteWithCell):
(JSC::JIT::emitLoadDouble):
(JSC::JIT::emitJumpSlowCaseIfNotJSCell):
(JSC::JIT::emitGetVirtualRegister):
(JSC::JIT::jumpTarget):
(JSC::JIT::loadPtrFromMetadata):
(JSC::JIT::load32FromMetadata):
(JSC::JIT::load8FromMetadata):
(JSC::JIT::store8ToMetadata):
(JSC::JIT::store32ToMetadata):
(JSC::JIT::materializePointerIntoMetadata):
(JSC::JIT::loadConstant):
(JSC::JIT::loadGlobalObject):
(JSC::JIT::loadCodeBlockConstant):
(JSC::JIT::copiedGetPutInfo): Deleted.
(JSC::JIT::copiedArithProfile): Deleted.

  • jit/JITOpcodes.cpp:

(JSC::JIT::emit_op_mov):
(JSC::JIT::emit_op_new_object):
(JSC::JIT::emitSlow_op_new_object):
(JSC::JIT::emit_op_overrides_has_instance):
(JSC::JIT::emit_op_instanceof):
(JSC::JIT::emitSlow_op_instanceof):
(JSC::JIT::emit_op_typeof_is_undefined):
(JSC::JIT::op_ret_handlerGenerator):
(JSC::JIT::emit_op_to_primitive):
(JSC::JIT::emit_op_set_function_name):
(JSC::JIT::emit_op_jfalse):
(JSC::JIT::valueIsFalseyGenerator):
(JSC::JIT::emit_op_jeq_null):
(JSC::JIT::emit_op_jneq_null):
(JSC::JIT::emit_op_jneq_ptr):
(JSC::JIT::emit_op_jtrue):
(JSC::JIT::valueIsTruthyGenerator):
(JSC::JIT::emit_op_throw):
(JSC::JIT::op_throw_handlerGenerator):
(JSC::JIT::emitSlow_op_jstricteq):
(JSC::JIT::emitSlow_op_jnstricteq):
(JSC::JIT::emit_op_to_number):
(JSC::JIT::emit_op_to_numeric):
(JSC::JIT::emit_op_to_object):
(JSC::JIT::emit_op_catch):
(JSC::JIT::emit_op_switch_imm):
(JSC::JIT::emit_op_switch_char):
(JSC::JIT::emit_op_switch_string):
(JSC::JIT::emit_op_debug):
(JSC::JIT::emit_op_eq_null):
(JSC::JIT::emit_op_neq_null):
(JSC::JIT::emit_op_enter):
(JSC::JIT::op_enter_handlerGenerator):
(JSC::JIT::emit_op_to_this):
(JSC::JIT::emit_op_create_this):
(JSC::JIT::emitSlow_op_eq):
(JSC::JIT::emitSlow_op_neq):
(JSC::JIT::emitSlow_op_jeq):
(JSC::JIT::emitSlow_op_jneq):
(JSC::JIT::emitSlow_op_instanceof_custom):
(JSC::JIT::emit_op_loop_hint):
(JSC::JIT::emitSlow_op_check_traps):
(JSC::JIT::op_check_traps_handlerGenerator):
(JSC::JIT::emit_op_new_regexp):
(JSC::JIT::emitNewFuncCommon):
(JSC::JIT::emitNewFuncExprCommon):
(JSC::JIT::emit_op_new_array):
(JSC::JIT::emit_op_new_array_with_size):
(JSC::JIT::emit_op_profile_type):
(JSC::JIT::emit_op_log_shadow_chicken_tail):
(JSC::JIT::emit_op_profile_control_flow):
(JSC::JIT::emit_op_get_argument):
(JSC::JIT::emit_op_get_prototype_of):

  • jit/JITOpcodes32_64.cpp:

(JSC::JIT::emit_op_new_object):
(JSC::JIT::emitSlow_op_new_object):
(JSC::JIT::emit_op_overrides_has_instance):
(JSC::JIT::emit_op_instanceof):
(JSC::JIT::emitSlow_op_instanceof):
(JSC::JIT::emitSlow_op_instanceof_custom):
(JSC::JIT::emit_op_typeof_is_undefined):
(JSC::JIT::emit_op_set_function_name):
(JSC::JIT::emit_op_jfalse):
(JSC::JIT::emit_op_jtrue):
(JSC::JIT::emit_op_jeq_null):
(JSC::JIT::emit_op_jneq_null):
(JSC::JIT::emit_op_jneq_ptr):
(JSC::JIT::emitSlow_op_eq):
(JSC::JIT::compileOpEqJumpSlow):
(JSC::JIT::emitSlow_op_neq):
(JSC::JIT::emitSlow_op_jstricteq):
(JSC::JIT::emitSlow_op_jnstricteq):
(JSC::JIT::emit_op_eq_null):
(JSC::JIT::emit_op_neq_null):
(JSC::JIT::emit_op_throw):
(JSC::JIT::emit_op_to_number):
(JSC::JIT::emit_op_to_numeric):
(JSC::JIT::emit_op_to_object):
(JSC::JIT::emit_op_catch):
(JSC::JIT::emit_op_switch_imm):
(JSC::JIT::emit_op_switch_char):
(JSC::JIT::emit_op_switch_string):
(JSC::JIT::emit_op_enter):
(JSC::JIT::emit_op_create_this):
(JSC::JIT::emit_op_to_this):
(JSC::JIT::emit_op_profile_type):
(JSC::JIT::emit_op_log_shadow_chicken_tail):

  • jit/JITOperations.cpp:

(JSC::JSC_DEFINE_JIT_OPERATION):

  • jit/JITOperations.h:
  • jit/JITPlan.cpp:

(JSC::JITPlan::key):

  • jit/JITPropertyAccess.cpp:

(JSC::JIT::emit_op_get_by_val):
(JSC::JIT::generateGetByValSlowCase):
(JSC::JIT::slow_op_get_by_val_prepareCallGenerator):
(JSC::JIT::emit_op_get_private_name):
(JSC::JIT::emitSlow_op_get_private_name):
(JSC::JIT::slow_op_get_private_name_prepareCallGenerator):
(JSC::JIT::emit_op_set_private_brand):
(JSC::JIT::emitSlow_op_set_private_brand):
(JSC::JIT::emit_op_check_private_brand):
(JSC::JIT::emitSlow_op_check_private_brand):
(JSC::JIT::emit_op_put_by_val):
(JSC::JIT::emitSlow_op_put_by_val):
(JSC::JIT::slow_op_put_by_val_prepareCallGenerator):
(JSC::JIT::emit_op_put_private_name):
(JSC::JIT::emitSlow_op_put_private_name):
(JSC::JIT::slow_op_put_private_name_prepareCallGenerator):
(JSC::JIT::emit_op_put_getter_by_id):
(JSC::JIT::emit_op_put_setter_by_id):
(JSC::JIT::emit_op_put_getter_setter_by_id):
(JSC::JIT::emit_op_put_getter_by_val):
(JSC::JIT::emit_op_put_setter_by_val):
(JSC::JIT::emit_op_del_by_id):
(JSC::JIT::emitSlow_op_del_by_id):
(JSC::JIT::slow_op_del_by_id_prepareCallGenerator):
(JSC::JIT::emit_op_del_by_val):
(JSC::JIT::emitSlow_op_del_by_val):
(JSC::JIT::slow_op_del_by_val_prepareCallGenerator):
(JSC::JIT::emit_op_try_get_by_id):
(JSC::JIT::emitSlow_op_try_get_by_id):
(JSC::JIT::emit_op_get_by_id_direct):
(JSC::JIT::emitSlow_op_get_by_id_direct):
(JSC::JIT::emit_op_get_by_id):
(JSC::JIT::emitSlow_op_get_by_id):
(JSC::JIT::emit_op_get_by_id_with_this):
(JSC::JIT::slow_op_get_by_id_prepareCallGenerator):
(JSC::JIT::emitSlow_op_get_by_id_with_this):
(JSC::JIT::slow_op_get_by_id_with_this_prepareCallGenerator):
(JSC::JIT::emit_op_put_by_id):
(JSC::JIT::emitSlow_op_put_by_id):
(JSC::JIT::slow_op_put_by_id_prepareCallGenerator):
(JSC::JIT::emit_op_in_by_id):
(JSC::JIT::emitSlow_op_in_by_id):
(JSC::JIT::emit_op_in_by_val):
(JSC::JIT::emitSlow_op_in_by_val):
(JSC::JIT::emitHasPrivate):
(JSC::JIT::emitHasPrivateSlow):
(JSC::JIT::emitSlow_op_has_private_name):
(JSC::JIT::emitSlow_op_has_private_brand):
(JSC::JIT::emitVarInjectionCheck):
(JSC::JIT::emitResolveClosure):
(JSC::JIT::emit_op_resolve_scope):
(JSC::JIT::generateOpResolveScopeThunk):
(JSC::JIT::slow_op_resolve_scopeGenerator):
(JSC::JIT::emit_op_get_from_scope):
(JSC::JIT::generateOpGetFromScopeThunk):
(JSC::JIT::slow_op_get_from_scopeGenerator):
(JSC::JIT::emit_op_put_to_scope):
(JSC::JIT::emitSlow_op_put_to_scope):
(JSC::JIT::slow_op_put_to_scopeGenerator):
(JSC::JIT::emit_op_get_from_arguments):
(JSC::JIT::emit_op_get_internal_field):
(JSC::JIT::emit_op_enumerator_next):
(JSC::JIT::emit_op_enumerator_get_by_val):
(JSC::JIT::emit_enumerator_has_propertyImpl):
(JSC::JIT::emitWriteBarrier):
(JSC::JIT::emitSlow_op_get_from_scope): Deleted.
(JSC::JIT::emitPutGlobalVariable): Deleted.
(JSC::JIT::emitPutGlobalVariableIndirect): Deleted.
(JSC::JIT::emitPutClosureVar): Deleted.

  • jit/JITPropertyAccess32_64.cpp:

(JSC::JIT::emit_op_put_getter_by_id):
(JSC::JIT::emit_op_put_setter_by_id):
(JSC::JIT::emit_op_put_getter_setter_by_id):
(JSC::JIT::emit_op_put_getter_by_val):
(JSC::JIT::emit_op_put_setter_by_val):
(JSC::JIT::emit_op_del_by_id):
(JSC::JIT::emit_op_del_by_val):
(JSC::JIT::emitSlow_op_del_by_val):
(JSC::JIT::emitSlow_op_del_by_id):
(JSC::JIT::emit_op_get_by_val):
(JSC::JIT::emitSlow_op_get_by_val):
(JSC::JIT::emit_op_get_private_name):
(JSC::JIT::emitSlow_op_get_private_name):
(JSC::JIT::emit_op_put_private_name):
(JSC::JIT::emitSlow_op_put_private_name):
(JSC::JIT::emit_op_set_private_brand):
(JSC::JIT::emitSlow_op_set_private_brand):
(JSC::JIT::emit_op_check_private_brand):
(JSC::JIT::emitSlow_op_check_private_brand):
(JSC::JIT::emit_op_put_by_val):
(JSC::JIT::emitSlow_op_put_by_val):
(JSC::JIT::emit_op_try_get_by_id):
(JSC::JIT::emitSlow_op_try_get_by_id):
(JSC::JIT::emit_op_get_by_id_direct):
(JSC::JIT::emitSlow_op_get_by_id_direct):
(JSC::JIT::emit_op_get_by_id):
(JSC::JIT::emitSlow_op_get_by_id):
(JSC::JIT::emit_op_get_by_id_with_this):
(JSC::JIT::emitSlow_op_get_by_id_with_this):
(JSC::JIT::emit_op_put_by_id):
(JSC::JIT::emitSlow_op_put_by_id):
(JSC::JIT::emit_op_in_by_id):
(JSC::JIT::emitSlow_op_in_by_id):
(JSC::JIT::emit_op_in_by_val):
(JSC::JIT::emitSlow_op_in_by_val):
(JSC::JIT::emitHasPrivate):
(JSC::JIT::emitHasPrivateSlow):
(JSC::JIT::emitVarInjectionCheck):
(JSC::JIT::emit_op_resolve_scope):
(JSC::JIT::emit_op_get_from_scope):
(JSC::JIT::emitSlow_op_get_from_scope):
(JSC::JIT::emit_op_put_to_scope):
(JSC::JIT::emitSlow_op_put_to_scope):
(JSC::JIT::emit_op_get_from_arguments):
(JSC::JIT::emit_op_get_internal_field):

  • jit/Repatch.cpp:

(JSC::tryCacheGetBy):
(JSC::tryCachePutBy):
(JSC::tryCacheInBy):
(JSC::unlinkCall):

  • jit/ThunkGenerators.cpp:

(JSC::handleExceptionGenerator):
(JSC::popThunkStackPreservesAndHandleExceptionGenerator):

  • jit/ThunkGenerators.h:
  • llint/LLIntSlowPaths.cpp:

(JSC::LLInt::jitCompileAndSetHeuristics):
(JSC::LLInt::LLINT_SLOW_PATH_DECL):

  • llint/LowLevelInterpreter.asm:
  • llint/LowLevelInterpreter32_64.asm:
  • llint/LowLevelInterpreter64.asm:
  • runtime/CacheableIdentifier.h:
  • runtime/CacheableIdentifierInlines.h:

(JSC::CacheableIdentifier::createFromIdentifierOwnedByCodeBlock):

  • runtime/CachedTypes.cpp:

(JSC::CachedCodeBlock::numBinaryArithProfiles const):
(JSC::CachedCodeBlock::numUnaryArithProfiles const):
(JSC::UnlinkedCodeBlock::UnlinkedCodeBlock):
(JSC::CachedCodeBlock<CodeBlockType>::encode):

  • runtime/CommonSlowPaths.cpp:

(JSC::updateArithProfileForUnaryArithOp):

  • runtime/FunctionExecutable.h:
  • runtime/Options.cpp:

(JSC::Options::recomputeDependentOptions):

  • runtime/OptionsList.h:
  • runtime/ScriptExecutable.cpp:

(JSC::ScriptExecutable::prepareForExecutionImpl):

  • wasm/WasmLLIntTierUpCounter.h:

(JSC::Wasm::LLIntTierUpCounter::optimizeAfterWarmUp):
(JSC::Wasm::LLIntTierUpCounter::optimizeSoon):

  • wasm/WasmTierUpCount.cpp:

(JSC::Wasm::TierUpCount::TierUpCount):

  • wasm/WasmTierUpCount.h:

(JSC::Wasm::TierUpCount::optimizeAfterWarmUp):
(JSC::Wasm::TierUpCount::optimizeNextInvocation):
(JSC::Wasm::TierUpCount::optimizeSoon):

Source/WTF:

  • wtf/Bag.h:
  • wtf/Packed.h:

(WTF::PackedAlignedPtr::operator* const):

  • trunk/Source/JavaScriptCore/jit/JIT.cpp

--- trunk/Source/JavaScriptCore/jit/JIT.cpp (r282239)
+++ trunk/Source/JavaScriptCore/jit/JIT.cpp (r283083)
@@ -70 +70 @@

 JIT::JIT(VM& vm, CodeBlock* codeBlock, BytecodeIndex loopOSREntryBytecodeIndex)
-    : JSInterfaceJIT(&vm, codeBlock)
+    : JSInterfaceJIT(&vm, nullptr)
     , m_interpreter(vm.interpreter)
     , m_labels(codeBlock ? codeBlock->instructions().size() : 0)
@@ -78 +78 @@
     , m_loopOSREntryBytecodeIndex(loopOSREntryBytecodeIndex)
 {
+    m_globalObjectConstant = m_constantPool.add(JITConstantPool::Type::GlobalObject);
+    m_profiledCodeBlock = codeBlock;
+    m_unlinkedCodeBlock = codeBlock->unlinkedCodeBlock();
 }

@@ -91 +94 @@

     JumpList skipOptimize;
-
-    skipOptimize.append(branchAdd32(Signed, TrustedImm32(Options::executionCounterIncrementForEntry()), AbsoluteAddress(m_codeBlock->addressOfJITExecuteCounter())));
+    loadPtr(addressFor(CallFrameSlot::codeBlock), regT0);
+    skipOptimize.append(branchAdd32(Signed, TrustedImm32(Options::executionCounterIncrementForEntry()), Address(regT0, CodeBlock::offsetOfJITExecuteCounter())));
     ASSERT(!m_bytecodeIndex.offset());

@@ -114 +117 @@
 }

-void JIT::emitNotifyWrite(GPRReg pointerToSet)
-{
+void JIT::emitNotifyWriteWatchpoint(GPRReg pointerToSet)
+{
+    auto ok = branchTestPtr(Zero, pointerToSet);
     addSlowCase(branch8(NotEqual, Address(pointerToSet, WatchpointSet::offsetOfState()), TrustedImm32(IsInvalidated)));
-}
-
-void JIT::emitVarReadOnlyCheck(ResolveType resolveType)
-{
-    if (resolveType == GlobalVar || resolveType == GlobalVarWithVarInjectionChecks)
-        addSlowCase(branch8(Equal, AbsoluteAddress(m_codeBlock->globalObject()->varReadOnlyWatchpoint()->addressOfState()), TrustedImm32(IsInvalidated)));
+    ok.link(this);
+}
+
+void JIT::emitVarReadOnlyCheck(ResolveType resolveType, GPRReg scratchGPR)
+{
+    if (resolveType == GlobalVar || resolveType == GlobalVarWithVarInjectionChecks) {
+        loadGlobalObject(scratchGPR);
+        loadPtr(Address(scratchGPR, OBJECT_OFFSETOF(JSGlobalObject, m_varReadOnlyWatchpoint)), scratchGPR);
+        addSlowCase(branch8(Equal, Address(scratchGPR, WatchpointSet::offsetOfState()), TrustedImm32(IsInvalidated)));
+    }
 }

@@ -130 +138 @@
         return;

-    addPtr(TrustedImm32(stackPointerOffsetFor(m_codeBlock) * sizeof(Register)), callFrameRegister, regT0);
+    addPtr(TrustedImm32(stackPointerOffsetFor(m_unlinkedCodeBlock) * sizeof(Register)), callFrameRegister, regT0);
     Jump ok = branchPtr(Equal, regT0, stackPointerRegister);
     breakpoint();
     ok.link(this);
+}
+
+void JIT::resetSP()
+{
+    addPtr(TrustedImm32(stackPointerOffsetFor(m_unlinkedCodeBlock) * sizeof(Register)), callFrameRegister, stackPointerRegister);
+    checkStackPointerAlignment();
 }

@@ -182 +196 @@
 }

+void JIT::emitPutCodeBlockToFrameInPrologue(GPRReg result)
+{
+    RELEASE_ASSERT(m_unlinkedCodeBlock->codeType() == FunctionCode);
+    emitGetFromCallFrameHeaderPtr(CallFrameSlot::callee, result);
+    loadPtr(Address(result, JSFunction::offsetOfExecutableOrRareData()), result);
+    auto hasExecutable = branchTestPtr(Zero, result, CCallHelpers::TrustedImm32(JSFunction::rareDataTag));
+    loadPtr(Address(result, FunctionRareData::offsetOfExecutable() - JSFunction::rareDataTag), result);
+    hasExecutable.link(this);
+    if (m_unlinkedCodeBlock->isConstructor())
+        loadPtr(Address(result, FunctionExecutable::offsetOfCodeBlockForConstruct()), result);
+    else
+        loadPtr(Address(result, FunctionExecutable::offsetOfCodeBlockForCall()), result);
+
+    loadPtr(Address(result, ExecutableToCodeBlockEdge::offsetOfCodeBlock()), result);
+    emitPutToCallFrameHeader(result, CallFrameSlot::codeBlock);
+
+#if ASSERT_ENABLED
+    probeDebug([=] (Probe::Context& ctx) {
+        CodeBlock* codeBlock = ctx.fp<CallFrame*>()->codeBlock();
+        RELEASE_ASSERT(codeBlock->jitType() == JITType::BaselineJIT);
+    });
+#endif
+}
+
 void JIT::privateCompileMainPass()
 {
     if (JITInternal::verbose)
-        dataLog("Compiling ", *m_codeBlock, "\n");
+        dataLog("Compiling ", *m_profiledCodeBlock, "\n");

     jitAssertTagsInPlace();
     jitAssertArgumentCountSane();

-    auto& instructions = m_codeBlock->instructions();
-    unsigned instructionCount = m_codeBlock->instructions().size();
+    auto& instructions = m_unlinkedCodeBlock->instructions();
+    unsigned instructionCount = m_unlinkedCodeBlock->instructions().size();

     m_callLinkInfoIndex = 0;

-    VM& vm = m_codeBlock->vm();
     BytecodeIndex startBytecodeIndex(0);
-    if (m_loopOSREntryBytecodeIndex && (m_codeBlock->inherits<ProgramCodeBlock>(vm) || m_codeBlock->inherits<ModuleProgramCodeBlock>(vm))) {
-        // We can only do this optimization because we execute ProgramCodeBlock's exactly once.
-        // This optimization would be invalid otherwise. When the LLInt determines it wants to
-        // do OSR entry into the baseline JIT in a loop, it will pass in the bytecode offset it
-        // was executing at when it kicked off our compilation. We only need to compile code for
-        // anything reachable from that bytecode offset.
-
-        // We only bother building the bytecode graph if it could save time and executable
-        // memory. We pick an arbitrary offset where we deem this is profitable.
-        if (m_loopOSREntryBytecodeIndex.offset() >= 200) {
-            // As a simplification, we don't find all bytecode ranges that are unreachable.
-            // Instead, we just find the minimum bytecode offset that is reachable, and
-            // compile code from that bytecode offset onwards.
-
-            BytecodeGraph graph(m_codeBlock, m_codeBlock->instructions());
-            BytecodeBasicBlock* block = graph.findBasicBlockForBytecodeOffset(m_loopOSREntryBytecodeIndex.offset());
-            RELEASE_ASSERT(block);
-
-            GraphNodeWorklist<BytecodeBasicBlock*> worklist;
-            startBytecodeIndex = BytecodeIndex();
-            worklist.push(block);
-
-            while (BytecodeBasicBlock* block = worklist.pop()) {
-                startBytecodeIndex = BytecodeIndex(std::min(startBytecodeIndex.offset(), block->leaderOffset()));
-                for (unsigned successorIndex : block->successors())
-                    worklist.push(&graph[successorIndex]);
-
-                // Also add catch blocks for bytecodes that throw.
-                if (m_codeBlock->numberOfExceptionHandlers()) {
-                    for (unsigned bytecodeOffset = block->leaderOffset(); bytecodeOffset < block->leaderOffset() + block->totalLength();) {
-                        auto instruction = instructions.at(bytecodeOffset);
-                        if (auto* handler = m_codeBlock->handlerForBytecodeIndex(BytecodeIndex(bytecodeOffset)))
-                            worklist.push(graph.findBasicBlockWithLeaderOffset(handler->target));
-
-                        bytecodeOffset += instruction->size();
-                    }
-                }
-            }
-        }
-    }

     m_bytecodeCountHavingSlowCase = 0;

@@ -279 +276 @@
         unsigned bytecodeOffset = m_bytecodeIndex.offset();
         if (UNLIKELY(Options::traceBaselineJITExecution())) {
-            CodeBlock* codeBlock = m_codeBlock;
             probeDebug([=] (Probe::Context& ctx) {
+                CodeBlock* codeBlock = ctx.fp<CallFrame*>()->codeBlock();
                 dataLogLn("JIT [", bytecodeOffset, "] ", opcodeNames[opcodeID], " cfr ", RawPointer(ctx.fp()), " @ ", codeBlock);
             });
         }
+
+        if (opcodeID != op_catch)
+            assertStackPointerOffset();

         switch (opcodeID) {

@@ -528 +528 @@
         BytecodeIndex firstTo = m_bytecodeIndex;

-        const Instruction* currentInstruction = m_codeBlock->instructions().at(m_bytecodeIndex).ptr();
+        const Instruction* currentInstruction = m_unlinkedCodeBlock->instructions().at(m_bytecodeIndex).ptr();

         if (JITInternal::verbose)

@@ -546 +546 @@
         if (UNLIKELY(Options::traceBaselineJITExecution())) {
             unsigned bytecodeOffset = m_bytecodeIndex.offset();
-            CodeBlock* codeBlock = m_codeBlock;
             probeDebug([=] (Probe::Context& ctx) {
+                CodeBlock* codeBlock = ctx.fp<CallFrame*>()->codeBlock();
                 dataLogLn("JIT [", bytecodeOffset, "] SLOW ", opcodeNames[opcodeID], " cfr ", RawPointer(ctx.fp()), " @ ", codeBlock);
             });

@@ -675 +675 @@
 }

+void JIT::emitMaterializeMetadataAndConstantPoolRegisters()
+{
+    loadPtr(addressFor(CallFrameSlot::codeBlock), regT0);
+    loadPtr(Address(regT0, CodeBlock::offsetOfMetadataTable()), s_metadataGPR);
+    loadPtr(Address(regT0, CodeBlock::offsetOfJITData()), regT0);
+    loadPtr(Address(regT0, CodeBlock::JITData::offsetOfJITConstantPool()), s_constantsGPR);
+}
+
+void JIT::emitRestoreCalleeSaves()
+{
+    Base::emitRestoreCalleeSavesFor(&RegisterAtOffsetList::llintBaselineCalleeSaveRegisters());
+}
+
 void JIT::compileAndLinkWithoutFinalizing(JITCompilationEffort effort)
 {
-    DFG::CapabilityLevel level = m_codeBlock->capabilityLevel();
+    DFG::CapabilityLevel level = m_profiledCodeBlock->capabilityLevel();
     switch (level) {
     case DFG::CannotCompile:
         m_canBeOptimized = false;
-        m_canBeOptimizedOrInlined = false;
         m_shouldEmitProfiling = false;
         break;

@@ -687 +699 @@
     case DFG::CanCompileAndInline:
         m_canBeOptimized = true;
-        m_canBeOptimizedOrInlined = true;
         m_shouldEmitProfiling = true;
         break;

@@ -694 +705 @@
         break;
     }
-
-    switch (m_codeBlock->codeType()) {
-    case GlobalCode:
-    case ModuleCode:
-    case EvalCode:
-        m_codeBlock->m_shouldAlwaysBeInlined = false;
-        break;
-    case FunctionCode:
-        // We could have already set it to false because we detected an uninlineable call.
-        // Don't override that observation.
-        m_codeBlock->m_shouldAlwaysBeInlined &= canInline(level) && DFG::mightInlineFunction(m_codeBlock);
-        break;
-    }
-
-    if (m_codeBlock->numberOfUnlinkedSwitchJumpTables() || m_codeBlock->numberOfUnlinkedStringSwitchJumpTables()) {
-        ConcurrentJSLocker locker(m_codeBlock->m_lock);
-        if (m_codeBlock->numberOfUnlinkedSwitchJumpTables())
-            m_codeBlock->ensureJITData(locker).m_switchJumpTables = FixedVector<SimpleJumpTable>(m_codeBlock->numberOfUnlinkedSwitchJumpTables());
-        if (m_codeBlock->numberOfUnlinkedStringSwitchJumpTables())
-            m_codeBlock->ensureJITData(locker).m_stringSwitchJumpTables = FixedVector<StringJumpTable>(m_codeBlock->numberOfUnlinkedStringSwitchJumpTables());
-    }
-
-    if (UNLIKELY(Options::dumpDisassembly() || (m_vm->m_perBytecodeProfiler && Options::disassembleBaselineForProfiler())))
-        m_disassembler = makeUnique<JITDisassembler>(m_codeBlock);
+
+    if (m_unlinkedCodeBlock->numberOfUnlinkedSwitchJumpTables() || m_unlinkedCodeBlock->numberOfUnlinkedStringSwitchJumpTables()) {
+        if (m_unlinkedCodeBlock->numberOfUnlinkedSwitchJumpTables())
+            m_switchJumpTables = FixedVector<SimpleJumpTable>(m_unlinkedCodeBlock->numberOfUnlinkedSwitchJumpTables());
+        if (m_unlinkedCodeBlock->numberOfUnlinkedStringSwitchJumpTables())
+            m_stringSwitchJumpTables = FixedVector<StringJumpTable>(m_unlinkedCodeBlock->numberOfUnlinkedStringSwitchJumpTables());
+    }
+
+    if (UNLIKELY(Options::dumpDisassembly() || (m_vm->m_perBytecodeProfiler && Options::disassembleBaselineForProfiler()))) {
+        // FIXME: build a disassembler off of UnlinkedCodeBlock.
+        m_disassembler = makeUnique<JITDisassembler>(m_profiledCodeBlock);
+    }
     if (UNLIKELY(m_vm->m_perBytecodeProfiler)) {
+        // FIXME: build profiler disassembler off UnlinkedCodeBlock.
         m_compilation = adoptRef(
             new Profiler::Compilation(
-                m_vm->m_perBytecodeProfiler->ensureBytecodesFor(m_codeBlock),
+                m_vm->m_perBytecodeProfiler->ensureBytecodesFor(m_profiledCodeBlock),
                 Profiler::Baseline));
-        m_compilation->addProfiledBytecodes(*m_vm->m_perBytecodeProfiler, m_codeBlock);
+        m_compilation->addProfiledBytecodes(*m_vm->m_perBytecodeProfiler, m_profiledCodeBlock);
     }

@@ -743 +743 @@

     emitFunctionPrologue();
-    emitPutToCallFrameHeader(m_codeBlock, CallFrameSlot::codeBlock);
+    if (m_unlinkedCodeBlock->codeType() == FunctionCode)
+        emitPutCodeBlockToFrameInPrologue();

     Label beginLabel(this);

-    int frameTopOffset = stackPointerOffsetFor(m_codeBlock) * sizeof(Register);
+    int frameTopOffset = stackPointerOffsetFor(m_unlinkedCodeBlock) * sizeof(Register);
     unsigned maxFrameSize = -frameTopOffset;
     addPtr(TrustedImm32(frameTopOffset), callFrameRegister, regT1);

@@ -758 +759 @@
     checkStackPointerAlignment();

-    emitSaveCalleeSaves();
+    emitSaveCalleeSavesFor(&RegisterAtOffsetList::llintBaselineCalleeSaveRegisters());
     emitMaterializeTagCheckRegisters();
-
-    if (m_codeBlock->codeType() == FunctionCode) {
+    emitMaterializeMetadataAndConstantPoolRegisters();
+
+    if (m_unlinkedCodeBlock->codeType() == FunctionCode) {
         ASSERT(!m_bytecodeIndex);
-        if (shouldEmitProfiling()) {
-            for (unsigned argument = 0; argument < m_codeBlock->numParameters(); ++argument) {
+        if (shouldEmitProfiling() && (!m_unlinkedCodeBlock->isConstructor() || m_unlinkedCodeBlock->numParameters() > 1)) {
+            emitGetFromCallFrameHeaderPtr(CallFrameSlot::codeBlock, regT2);
+            loadPtr(Address(regT2, CodeBlock::offsetOfArgumentValueProfiles() + FixedVector<ValueProfile>::offsetOfStorage()), regT2);
+
+            for (unsigned argument = 0; argument < m_unlinkedCodeBlock->numParameters(); ++argument) {
                 // If this is a constructor, then we want to put in a dummy profiling site (to
                 // keep things consistent) but we don't actually want to record the dummy value.
-                if (m_codeBlock->isConstructor() && !argument)
+                if (m_unlinkedCodeBlock->isConstructor() && !argument)
                     continue;
                 int offset = CallFrame::argumentOffsetIncludingThis(argument) * static_cast<int>(sizeof(Register));

@@ -778 +783 @@
                 load32(Address(callFrameRegister, offset + OBJECT_OFFSETOF(JSValue, u.asBits.tag)), resultRegs.tagGPR());
 #endif
-                emitValueProfilingSite(m_codeBlock->valueProfileForArgument(argument), resultRegs);
+                storeValue(resultRegs, Address(regT2, argument * sizeof(ValueProfile) + ValueProfile::offsetOfFirstBucket()));
             }
         }
     }

-    RELEASE_ASSERT(!JITCode::isJIT(m_codeBlock->jitType()));
+    RELEASE_ASSERT(!JITCode::isJIT(m_profiledCodeBlock->jitType()));

     if (UNLIKELY(sizeMarker))

@@ -800 +805 @@
     if (maxFrameExtentForSlowPathCall)
         addPtr(TrustedImm32(-static_cast<int32_t>(maxFrameExtentForSlowPathCall)), stackPointerRegister);
-    callOperationWithCallFrameRollbackOnException(operationThrowStackOverflowError, m_codeBlock);
+    emitGetFromCallFrameHeaderPtr(CallFrameSlot::codeBlock, regT0);
+    callOperationWithCallFrameRollbackOnException(operationThrowStackOverflowError, regT0);

     // If the number of parameters is 1, we never require arity fixup.
-    bool requiresArityFixup = m_codeBlock->m_numParameters != 1;
-    if (m_codeBlock->codeType() == FunctionCode && requiresArityFixup) {
+    bool requiresArityFixup = m_unlinkedCodeBlock->numParameters() != 1;
+    if (m_unlinkedCodeBlock->codeType() == FunctionCode && requiresArityFixup) {
         m_arityCheck = label();
-        store8(TrustedImm32(0), &m_codeBlock->m_shouldAlwaysBeInlined);
+
         emitFunctionPrologue();
-        emitPutToCallFrameHeader(m_codeBlock, CallFrameSlot::codeBlock);
+        emitPutCodeBlockToFrameInPrologue(regT0);
+        store8(TrustedImm32(0), Address(regT0, CodeBlock::offsetOfShouldAlwaysBeInlined()));

         load32(payloadFor(CallFrameSlot::argumentCountIncludingThis), regT1);
-        branch32(AboveOrEqual, regT1, TrustedImm32(m_codeBlock->m_numParameters)).linkTo(beginLabel, this);
+        branch32(AboveOrEqual, regT1, TrustedImm32(m_unlinkedCodeBlock->numParameters())).linkTo(beginLabel, this);

         m_bytecodeIndex = BytecodeIndex(0);

@@ -817 +824 @@
         if (maxFrameExtentForSlowPathCall)
             addPtr(TrustedImm32(-static_cast<int32_t>(maxFrameExtentForSlowPathCall)), stackPointerRegister);
-        callOperationWithCallFrameRollbackOnException(m_codeBlock->isConstructor() ? operationConstructArityCheck : operationCallArityCheck, m_codeBlock->globalObject());
+        loadPtr(Address(regT0, CodeBlock::offsetOfGlobalObject()), argumentGPR0);
+        callOperationWithCallFrameRollbackOnException(m_unlinkedCodeBlock->isConstructor() ? operationConstructArityCheck : operationCallArityCheck, argumentGPR0);
         if (maxFrameExtentForSlowPathCall)
             addPtr(TrustedImm32(maxFrameExtentForSlowPathCall), stackPointerRegister);

@@ -840 +848 @@
     m_pcToCodeOriginMapBuilder.appendItem(label(), PCToCodeOriginMapBuilder::defaultCodeOrigin());

-    m_linkBuffer = std::unique_ptr<LinkBuffer>(new LinkBuffer(*this, m_codeBlock, LinkBuffer::Profile::BaselineJIT, effort));
+    m_linkBuffer = std::unique_ptr<LinkBuffer>(new LinkBuffer(*this, m_unlinkedCodeBlock, LinkBuffer::Profile::BaselineJIT, effort));
     link();
 }

@@ -859 +867 @@
         case SwitchRecord::Immediate:
         case SwitchRecord::Character: {
-            const UnlinkedSimpleJumpTable& unlinkedTable = m_codeBlock->unlinkedSwitchJumpTable(tableIndex);
-            SimpleJumpTable& linkedTable = m_codeBlock->switchJumpTable(tableIndex);
+            const UnlinkedSimpleJumpTable& unlinkedTable = m_unlinkedCodeBlock->unlinkedSwitchJumpTable(tableIndex);
+            SimpleJumpTable& linkedTable = m_switchJumpTables[tableIndex];
             linkedTable.m_ctiDefault = patchBuffer.locationOf<JSSwitchPtrTag>(m_labels[bytecodeOffset + record.defaultOffset]);
             for (unsigned j = 0; j < unlinkedTable.m_branchOffsets.size(); ++j) {

@@ -872 +880 @@

         case SwitchRecord::String: {
-            const UnlinkedStringJumpTable& unlinkedTable = m_codeBlock->unlinkedStringSwitchJumpTable(tableIndex);
-            StringJumpTable& linkedTable = m_codeBlock->stringSwitchJumpTable(tableIndex);
+            const UnlinkedStringJumpTable& unlinkedTable = m_unlinkedCodeBlock->unlinkedStringSwitchJumpTable(tableIndex);
+            StringJumpTable& linkedTable = m_stringSwitchJumpTables[tableIndex];
             auto ctiDefault = patchBuffer.locationOf<JSSwitchPtrTag>(m_labels[bytecodeOffset + record.defaultOffset]);
             for (auto& location : unlinkedTable.m_offsetTable.values()) {

@@ -907 +915 @@
     }

+#if USE(JSVALUE64)
+    auto finalizeICs = [&] (auto& generators) {
+        for (auto& gen : generators) {
+            gen.m_unlinkedStubInfo->start = patchBuffer.locationOf<JITStubRoutinePtrTag>(gen.m_start);
+            gen.m_unlinkedStubInfo->doneLocation = patchBuffer.locationOf<JSInternalPtrTag>(gen.m_done);
+            gen.m_unlinkedStubInfo->slowPathStartLocation = patchBuffer.locationOf<JITStubRoutinePtrTag>(gen.m_slowPathBegin);
+        }
+    };
+
+    finalizeICs(m_getByIds);
+    finalizeICs(m_getByVals);
+    finalizeICs(m_getByIdsWithThis);
+    finalizeICs(m_putByIds);
+    finalizeICs(m_putByVals);
+    finalizeICs(m_delByIds);
+    finalizeICs(m_delByVals);
+    finalizeICs(m_inByIds);
+    finalizeICs(m_inByVals);
+    finalizeICs(m_instanceOfs);
+    finalizeICs(m_privateBrandAccesses);
+#else
     finalizeInlineCaches(m_getByIds, patchBuffer);
     finalizeInlineCaches(m_getByVals, patchBuffer);

@@ -918 +947 @@
     finalizeInlineCaches(m_instanceOfs, patchBuffer);
     finalizeInlineCaches(m_privateBrandAccesses, patchBuffer);
+#endif

     for (auto& compilationInfo : m_callCompilationInfo) {
+#if USE(JSVALUE64)
+        UnlinkedCallLinkInfo& info = *compilationInfo.unlinkedCallLinkInfo;
+        info.doneLocation = patchBuffer.locationOf<JSInternalPtrTag>(compilationInfo.doneLocation);
+#else
         CallLinkInfo& info = *compilationInfo.callLinkInfo;
         info.setCodeLocations(
             patchBuffer.locationOf<JSInternalPtrTag>(compilationInfo.slowPathStart),
             patchBuffer.locationOf<JSInternalPtrTag>(compilationInfo.doneLocation));
-    }
-
-    {
-        JITCodeMapBuilder jitCodeMapBuilder;
-        for (unsigned bytecodeOffset = 0; bytecodeOffset < m_labels.size(); ++bytecodeOffset) {
-            if (m_labels[bytecodeOffset].isSet())
-                jitCodeMapBuilder.append(BytecodeIndex(bytecodeOffset), patchBuffer.locationOf<JSEntryPtrTag>(m_labels[bytecodeOffset]));
-        }
-        m_codeBlock->setJITCodeMap(jitCodeMapBuilder.finalize());
+#endif
+
+    }
+
+    JITCodeMapBuilder jitCodeMapBuilder;
+    for (unsigned bytecodeOffset = 0; bytecodeOffset < m_labels.size(); ++bytecodeOffset) {
+        if (m_labels[bytecodeOffset].isSet())
+            jitCodeMapBuilder.append(BytecodeIndex(bytecodeOffset), patchBuffer.locationOf<JSEntryPtrTag>(m_labels[bytecodeOffset]));
     }

@@ -941 +974 @@

     if (UNLIKELY(m_compilation)) {
+        // FIXME: should we make the bytecode profiler know about UnlinkedCodeBlock?
         if (Options::disassembleBaselineForProfiler())
             m_disassembler->reportToProfiler(m_compilation.get(), patchBuffer);
-        m_vm->m_perBytecodeProfiler->addCompilation(m_codeBlock, *m_compilation);
+        m_vm->m_perBytecodeProfiler->addCompilation(m_profiledCodeBlock, *m_compilation);
     }

@@ -949 +983 @@
         m_pcToCodeOriginMap = makeUnique<PCToCodeOriginMap>(WTFMove(m_pcToCodeOriginMapBuilder), patchBuffer);

+    // FIXME: Make a version of CodeBlockWithJITType that knows about UnlinkedCodeBlock.
     CodeRef<JSEntryPtrTag> result = FINALIZE_CODE(
         patchBuffer, JSEntryPtrTag,
-        "Baseline JIT code for %s", toCString(CodeBlockWithJITType(m_codeBlock, JITType::BaselineJIT)).data());
+        "Baseline JIT code for %s", toCString(CodeBlockWithJITType(m_profiledCodeBlock, JITType::BaselineJIT)).data());

     MacroAssemblerCodePtr<JSEntryPtrTag> withArityCheck = patchBuffer.locationOf<JSEntryPtrTag>(m_arityCheck);
-    m_jitCode = adoptRef(*new DirectJITCode(result, withArityCheck, JITType::BaselineJIT));
+    m_jitCode = adoptRef(*new BaselineJITCode(result, withArityCheck));
+
+    m_jitCode->m_unlinkedCalls = WTFMove(m_unlinkedCalls);
+    m_jitCode->m_evalCallLinkInfos = WTFMove(m_evalCallLinkInfos);
+    m_jitCode->m_unlinkedStubInfos = WTFMove(m_unlinkedStubInfos);
+    m_jitCode->m_switchJumpTables = WTFMove(m_switchJumpTables);
+    m_jitCode->m_stringSwitchJumpTables = WTFMove(m_stringSwitchJumpTables);
+    m_jitCode->m_jitCodeMap = jitCodeMapBuilder.finalize();
+    m_jitCode->adoptMathICs(m_mathICs);
+    m_jitCode->m_constantPool = WTFMove(m_constantPool);
+#if USE(JSVALUE64)
+    m_jitCode->m_isShareable = m_isShareable;
+#else
+    m_jitCode->m_isShareable = false;
+#endif

     if (JITInternal::verbose)
-        dataLogF("JIT generated code for %p at [%p, %p).\n", m_codeBlock, result.executableMemory()->start().untaggedPtr(), result.executableMemory()->end().untaggedPtr());
-}
-
-CompilationResult JIT::finalizeOnMainThread()
+        dataLogF("JIT generated code for %p at [%p, %p).\n", m_unlinkedCodeBlock, result.executableMemory()->start().untaggedPtr(), result.executableMemory()->end().untaggedPtr());
+}
+
+CompilationResult JIT::finalizeOnMainThread(CodeBlock* codeBlock)
 {
     RELEASE_ASSERT(!isCompilationThread());

@@ -969 +1018 @@
     m_linkBuffer->runMainThreadFinalizationTasks();

-    {
-        ConcurrentJSLocker locker(m_codeBlock->m_lock);
-        m_codeBlock->shrinkToFit(locker, CodeBlock::ShrinkMode::LateShrink);
-    }
-
-    for (size_t i = 0; i < m_codeBlock->numberOfExceptionHandlers(); ++i) {
-        HandlerInfo& handler = m_codeBlock->exceptionHandler(i);
-        // FIXME: <rdar://problem/39433318>.
-        handler.nativeCode = m_codeBlock->jitCodeMap().find(BytecodeIndex(handler.target)).retagged<ExceptionHandlerPtrTag>();
-    }
-
     if (m_pcToCodeOriginMap)
-        m_codeBlock->setPCToCodeOriginMap(WTFMove(m_pcToCodeOriginMap));
+        m_jitCode->m_pcToCodeOriginMap = WTFMove(m_pcToCodeOriginMap);

     m_vm->machineCodeBytesPerBytecodeWordForBaselineJIT->add(
         static_cast<double>(m_jitCode->size()) /
-        static_cast<double>(m_codeBlock->instructionsSize()));
-
-    m_codeBlock->setJITCode(m_jitCode.releaseNonNull());
+        static_cast<double>(m_unlinkedCodeBlock->instructionsSize()));
+
+    codeBlock->setupWithUnlinkedBaselineCode(m_jitCode.releaseNonNull());

     return CompilationSuccessful;

@@ -999 +1037 @@
 }

-CompilationResult JIT::privateCompile(JITCompilationEffort effort)
+CompilationResult JIT::privateCompile(CodeBlock* codeBlock, JITCompilationEffort effort)
 {
     doMainThreadPreparationBeforeCompile();
     compileAndLinkWithoutFinalizing(effort);
-    return finalizeOnMainThread();
+    return finalizeOnMainThread(codeBlock);
 }

@@ -1043 +1081 @@
 }

+unsigned JIT::frameRegisterCountFor(UnlinkedCodeBlock* codeBlock)
+{
+    ASSERT(static_cast<unsigned>(codeBlock->numCalleeLocals()) == WTF::roundUpToMultipleOf(stackAlignmentRegisters(), static_cast<unsigned>(codeBlock->numCalleeLocals())));
+
+    return roundLocalRegisterCountForFramePointerOffset(codeBlock->numCalleeLocals() + maxFrameExtentForSlowPathCallInRegisters);
+}
+
 unsigned JIT::frameRegisterCountFor(CodeBlock* codeBlock)
 {
-    ASSERT(static_cast<unsigned>(codeBlock->numCalleeLocals()) == WTF::roundUpToMultipleOf(stackAlignmentRegisters(), static_cast<unsigned>(codeBlock->numCalleeLocals())));
-
-    return roundLocalRegisterCountForFramePointerOffset(codeBlock->numCalleeLocals() + maxFrameExtentForSlowPathCallInRegisters);
+    return frameRegisterCountFor(codeBlock->unlinkedCodeBlock());
+}
+
+int JIT::stackPointerOffsetFor(UnlinkedCodeBlock* codeBlock)
+{
+    return virtualRegisterForLocal(frameRegisterCountFor(codeBlock) - 1).offset();
 }

 int JIT::stackPointerOffsetFor(CodeBlock* codeBlock)
 {
-    return virtualRegisterForLocal(frameRegisterCountFor(codeBlock) - 1).offset();
+    return stackPointerOffsetFor(codeBlock->unlinkedCodeBlock());
 }