Timestamp:
Sep 27, 2021, 12:52:48 AM
Author:
[email protected]
Message:

Build an unlinked baseline JIT
https://bugs.webkit.org/show_bug.cgi?id=229223
<rdar://problem/82321772>

Reviewed by Yusuke Suzuki.

Source/JavaScriptCore:

This patch adds an "unlinked" baseline JIT to JSVALUE64 platforms. The JIT
code produced by this baseline JIT can be shared between all CodeBlocks that
share an UnlinkedCodeBlock. The benefit is that when we create a CodeBlock
from an UnlinkedCodeBlock for which an unlinked baseline JIT instance has
already been compiled, the new CodeBlock starts off executing in the baseline
JIT "for free".
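
The sharing can be pictured with a schematic C++ sketch (this is an illustration-only model, not actual JSC code; in JSC the compile is triggered by execution-count thresholds, not at CodeBlock creation): the compiled artifact hangs off the UnlinkedCodeBlock, so every CodeBlock created from it reuses the same JIT code.

```cpp
#include <cassert>
#include <memory>

// Illustration-only stand-ins for the JSC types.
struct BaselineJITCode { int generation = 0; };

struct UnlinkedCodeBlock {
    std::shared_ptr<BaselineJITCode> baselineCode; // shared JIT artifact
};

struct CodeBlock {
    UnlinkedCodeBlock* unlinked;
    std::shared_ptr<BaselineJITCode> jitCode;
};

CodeBlock makeCodeBlock(UnlinkedCodeBlock& unlinked, int& compileCount)
{
    CodeBlock block { &unlinked, nullptr };
    if (!unlinked.baselineCode) {
        // The first CodeBlock pays for compilation...
        unlinked.baselineCode = std::make_shared<BaselineJITCode>();
        unlinked.baselineCode->generation = ++compileCount;
    }
    // ...later ones start in the baseline JIT "for free".
    block.jitCode = unlinked.baselineCode;
    return block;
}
```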

To make this work, the code we emit now needs to be independent of any
specific CodeBlock instance. We still use a CodeBlock instance for minimal
profiling information while compiling, but otherwise the code is tied to the
UnlinkedCodeBlock. When we need CodeBlock-specific information, we load it
dynamically, usually from the Metadata table. This patch also adds a "linked
constant pool" concept: any time we instantiate such a CodeBlock, we also
instantiate its "linked constant pool", which contains things like our inline
cache data structures (StructureStubInfo*), the JSGlobalObject*, etc.
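
The unlinked/linked split can be sketched as follows (a schematic model only; names like SlotKind are invented for illustration): the unlinked pool, shared by all CodeBlocks, records only what kind of value each slot holds, and emitted code refers to slots by index; instantiating a CodeBlock materializes a parallel table of real pointers.

```cpp
#include <cassert>
#include <vector>

// Hypothetical stand-ins for JSC types, for illustration only.
struct JSGlobalObject { int id; };
struct StructureStubInfo { int accessType; };

// The unlinked constant pool: shared across all CodeBlocks for one
// UnlinkedCodeBlock. It only records the *kind* of each slot.
enum class SlotKind { GlobalObject, StructureStubInfo };

struct UnlinkedConstantPool {
    std::vector<SlotKind> slots;
    unsigned add(SlotKind kind)
    {
        slots.push_back(kind);
        return slots.size() - 1;
    }
};

// The linked constant pool: one per CodeBlock instance. JIT code loads
// entries from this table dynamically instead of baking pointers in.
struct LinkedConstantPool {
    std::vector<void*> entries;
};

LinkedConstantPool instantiate(const UnlinkedConstantPool& unlinked, JSGlobalObject* globalObject)
{
    LinkedConstantPool linked;
    for (SlotKind kind : unlinked.slots) {
        switch (kind) {
        case SlotKind::GlobalObject:
            linked.entries.push_back(globalObject);
            break;
        case SlotKind::StructureStubInfo:
            // A real implementation would allocate per-CodeBlock stub infos here.
            linked.entries.push_back(new StructureStubInfo { 0 });
            break;
        }
    }
    return linked;
}
```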

The unlinked baseline JIT always runs ICs in "data" mode. To make this work,
I made data ICs work on x86_64: we no longer call/ret to the IC. Instead, we
jump to the IC, and the IC jumps back by dynamically loading the "done"
location from the StructureStubInfo. This removes the design's dependence on
the arm64 calling convention while keeping the same performance characteristics.
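
The jump-in/jump-out control flow can be modeled in plain C++ (a loose sketch only: in real JIT code these are machine-code labels and indirect jumps; here a stored continuation stands in for the dynamically loaded "done" location, and StubInfo is a simplified stand-in for StructureStubInfo):

```cpp
#include <cassert>
#include <functional>

// Simplified stand-in for StructureStubInfo.
struct StubInfo {
    std::function<void()> doneLocation; // loaded dynamically by the IC
    int cachedValue = 42;
    int result = 0;
};

// The IC body: does its work, then "jumps back" by loading the done
// location from the StubInfo instead of executing a ret.
void icBody(StubInfo& stub)
{
    stub.result = stub.cachedValue;
    stub.doneLocation(); // a jump, not a ret: no return address is pushed
}

// The fast path: records where execution should resume, then jumps to the IC.
int fastPath(StubInfo& stub)
{
    int resumedWith = -1;
    stub.doneLocation = [&] { resumedWith = stub.result; };
    icBody(stub); // "jmp" to the IC
    return resumedWith;
}
```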

This patch also adds a new version of InlineAccess that is used only in the
baseline JIT (for now). In the future, we can make the DFG/FTL use it for
data ICs too, but we don't need to yet, since those tiers don't use data ICs
by default. The baseline JIT now takes a pure data IC approach to
InlineAccess: instead of repatching code, we repatch fields that the code
loads dynamically.
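
The field-repatching idea can be sketched like so (an illustration-only model; InlineAccessData, Object, and the slot layout are invented for the example): the emitted code is immutable and reads these fields on every execution, so caching a new access means storing new field values rather than rewriting machine code.

```cpp
#include <cassert>
#include <cstdint>

// Toy object model: a structure ID plus inline slots.
struct Object {
    uint32_t structureID;
    int slots[4];
};

// The fields the shared code loads dynamically; the slow path "repatches"
// them by plain stores, leaving the machine code untouched.
struct InlineAccessData {
    uint32_t expectedStructureID = 0; // repatched on cache fill
    unsigned slotOffset = 0;          // repatched on cache fill
};

// What the emitted (shared) inline access code does on every execution.
bool tryInlineAccess(const InlineAccessData& data, const Object& object, int& result)
{
    if (object.structureID != data.expectedStructureID)
        return false; // take the slow path
    result = object.slots[data.slotOffset];
    return true;
}

// The slow path caches a self-property access by storing fields.
void cacheSelfAccess(InlineAccessData& data, uint32_t structureID, unsigned offset)
{
    data.expectedStructureID = structureID;
    data.slotOffset = offset;
}
```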

This patch also cleans up OSR exit: both the DFG and FTL were storing callee
saves to the callee-saves buffer in an odd place, and separately from one
another. This code becomes simpler if we just store callee saves at the end
of the OSR exit handler, from common JIT emission code.

This patch also fixes a bug where we could end up with the wrong (and always
more negative) SP in the baseline JIT. This could happen when we OSR exit
from an inlined getter/setter: the OSR exit code sets the return PC for
returning to the getter/setter's call site to the inline cache's "done
location", but that done location did not restore the SP. This patch
conservatively restores the SP at these sites.

This is measured as a 1% speedup on Speedometer2.

  • CMakeLists.txt:
  • JavaScriptCore.xcodeproj/project.pbxproj:
  • Sources.txt:
  • bytecode/AccessCase.cpp:

(JSC::AccessCase::fromStructureStubInfo):
(JSC::AccessCase::generateImpl):

  • bytecode/BytecodeList.rb:
  • bytecode/BytecodeOperandsForCheckpoint.h:

(JSC::valueProfileOffsetFor):

  • bytecode/CallLinkInfo.cpp:

(JSC::CallLinkInfo::fastPathStart):
(JSC::CallLinkInfo::emitFastPathImpl):
(JSC::CallLinkInfo::emitFastPath):
(JSC::CallLinkInfo::emitTailCallFastPath):
(JSC::CallLinkInfo::emitDataICFastPath):
(JSC::CallLinkInfo::emitTailCallDataICFastPath):
(JSC::CallLinkInfo::emitDataICSlowPath):
(JSC::CallLinkInfo::initializeDataIC):
(JSC::CallLinkInfo::emitDirectFastPath):
(JSC::CallLinkInfo::emitDirectTailCallFastPath):

  • bytecode/CallLinkInfo.h:

(JSC::CallLinkInfo::offsetOfMaxArgumentCountIncludingThis):
(JSC::CallLinkInfo::slowStub): Deleted.
(JSC::CallLinkInfo::addressOfMaxArgumentCountIncludingThis): Deleted.

  • bytecode/CodeBlock.cpp:

(JSC::CodeBlock::CodeBlock):
(JSC::CodeBlock::finishCreation):
(JSC::CodeBlock::setupWithUnlinkedBaselineCode):
(JSC::CodeBlock::isConstantOwnedByUnlinkedCodeBlock const):
(JSC::CodeBlock::setConstantRegisters):
(JSC::CodeBlock::finalizeJITInlineCaches):
(JSC::CodeBlock::finalizeUnconditionally):
(JSC::CodeBlock::frameRegisterCount):
(JSC::CodeBlock::binaryArithProfileForPC):
(JSC::CodeBlock::unaryArithProfileForPC):
(JSC::CodeBlock::findPC):
(JSC::CodeBlock::jitSoon):
(JSC::CodeBlock::jitNextInvocation):
(JSC::CodeBlock::dumpMathICStats):
(JSC::CodeBlock::finalizeBaselineJITInlineCaches): Deleted.
(JSC::CodeBlock::addJITAddIC): Deleted.
(JSC::CodeBlock::addJITMulIC): Deleted.
(JSC::CodeBlock::addJITSubIC): Deleted.
(JSC::CodeBlock::addJITNegIC): Deleted.
(JSC::CodeBlock::setPCToCodeOriginMap): Deleted.
(JSC::CodeBlock::thresholdForJIT): Deleted.
(JSC::CodeBlock::jitAfterWarmUp): Deleted.

  • bytecode/CodeBlock.h:

(JSC::CodeBlock::JITData::offsetOfJITConstantPool):
(JSC::CodeBlock::offsetOfJITData):
(JSC::CodeBlock::offsetOfArgumentValueProfiles):
(JSC::CodeBlock::offsetOfConstantsVectorBuffer):
(JSC::CodeBlock::baselineJITConstantPool):
(JSC::CodeBlock::checkIfJITThresholdReached):
(JSC::CodeBlock::dontJITAnytimeSoon):
(JSC::CodeBlock::llintExecuteCounter const):
(JSC::CodeBlock::offsetOfDebuggerRequests):
(JSC::CodeBlock::offsetOfShouldAlwaysBeInlined):
(JSC::CodeBlock::loopHintsAreEligibleForFuzzingEarlyReturn):
(JSC::CodeBlock::addressOfNumParameters): Deleted.
(JSC::CodeBlock::isKnownCell): Deleted.
(JSC::CodeBlock::addMathIC): Deleted.
(JSC::CodeBlock::setJITCodeMap): Deleted.
(JSC::CodeBlock::jitCodeMap): Deleted.
(JSC::CodeBlock::switchJumpTable): Deleted.
(JSC::CodeBlock::stringSwitchJumpTable): Deleted.

  • bytecode/CodeBlockInlines.h:

(JSC::CodeBlock::forEachValueProfile):
(JSC::CodeBlock::jitCodeMap):
(JSC::CodeBlock::baselineSwitchJumpTable):
(JSC::CodeBlock::baselineStringSwitchJumpTable):
(JSC::CodeBlock::dfgSwitchJumpTable):
(JSC::CodeBlock::dfgStringSwitchJumpTable):

  • bytecode/ExecutableToCodeBlockEdge.h:
  • bytecode/ExecutionCounter.cpp:

(JSC::ExecutionCounter<countingVariant>::setThreshold):

  • bytecode/ExecutionCounter.h:

(JSC::ExecutionCounter::clippedThreshold):

  • bytecode/GetByIdMetadata.h:

(JSC::GetByIdModeMetadataArrayLength::offsetOfArrayProfile):
(JSC::GetByIdModeMetadata::offsetOfMode):

  • bytecode/GetByStatus.cpp:

(JSC::GetByStatus::computeForStubInfoWithoutExitSiteFeedback):

  • bytecode/GetterSetterAccessCase.cpp:

(JSC::GetterSetterAccessCase::emitDOMJITGetter):

  • bytecode/InByStatus.cpp:

(JSC::InByStatus::computeForStubInfoWithoutExitSiteFeedback):

  • bytecode/InlineAccess.cpp:

(JSC::InlineAccess::generateSelfPropertyAccess):
(JSC::InlineAccess::canGenerateSelfPropertyReplace):
(JSC::InlineAccess::generateSelfPropertyReplace):
(JSC::InlineAccess::isCacheableArrayLength):
(JSC::InlineAccess::generateArrayLength):
(JSC::InlineAccess::isCacheableStringLength):
(JSC::InlineAccess::generateStringLength):
(JSC::InlineAccess::generateSelfInAccess):
(JSC::InlineAccess::rewireStubAsJumpInAccess):
(JSC::InlineAccess::resetStubAsJumpInAccess):

  • bytecode/InlineAccess.h:
  • bytecode/IterationModeMetadata.h:

(JSC::IterationModeMetadata::offsetOfSeenModes):

  • bytecode/LLIntCallLinkInfo.h:

(JSC::LLIntCallLinkInfo::offsetOfArrayProfile):

  • bytecode/Opcode.h:
  • bytecode/PolymorphicAccess.cpp:

(JSC::AccessGenerationState::succeed):
(JSC::AccessGenerationState::calculateLiveRegistersForCallAndExceptionHandling):
(JSC::AccessGenerationState::preserveLiveRegistersToStackForCallWithoutExceptions):
(JSC::PolymorphicAccess::regenerate):

  • bytecode/PolymorphicAccess.h:

(JSC::AccessGenerationState::preserveLiveRegistersToStackForCallWithoutExceptions): Deleted.

  • bytecode/PutByStatus.cpp:

(JSC::PutByStatus::computeForStubInfo):

  • bytecode/StructureStubInfo.cpp:

(JSC::StructureStubInfo::initGetByIdSelf):
(JSC::StructureStubInfo::initPutByIdReplace):
(JSC::StructureStubInfo::initInByIdSelf):
(JSC::StructureStubInfo::addAccessCase):
(JSC::StructureStubInfo::reset):
(JSC::StructureStubInfo::visitWeakReferences):
(JSC::StructureStubInfo::propagateTransitions):
(JSC::StructureStubInfo::initializeFromUnlinkedStructureStubInfo):

  • bytecode/StructureStubInfo.h:

(JSC::StructureStubInfo::offsetOfByIdSelfOffset):
(JSC::StructureStubInfo::offsetOfInlineAccessBaseStructure):
(JSC::StructureStubInfo::inlineAccessBaseStructure):
(JSC::StructureStubInfo::offsetOfDoneLocation):

  • bytecode/SuperSampler.cpp:

(JSC::printSuperSamplerState):

  • bytecode/UnlinkedCodeBlock.cpp:

(JSC::UnlinkedCodeBlock::UnlinkedCodeBlock):
(JSC::UnlinkedCodeBlock::hasIdentifier):
(JSC::UnlinkedCodeBlock::thresholdForJIT):
(JSC::UnlinkedCodeBlock::allocateSharedProfiles):

  • bytecode/UnlinkedCodeBlock.h:

(JSC::UnlinkedCodeBlock::constantRegister):
(JSC::UnlinkedCodeBlock::instructionAt const):
(JSC::UnlinkedCodeBlock::bytecodeOffset):
(JSC::UnlinkedCodeBlock::instructionsSize const):
(JSC::UnlinkedCodeBlock::loopHintsAreEligibleForFuzzingEarlyReturn):
(JSC::UnlinkedCodeBlock::outOfLineJumpOffset):
(JSC::UnlinkedCodeBlock::binaryArithProfile):
(JSC::UnlinkedCodeBlock::unaryArithProfile):
(JSC::UnlinkedCodeBlock::llintExecuteCounter):

  • bytecode/UnlinkedMetadataTable.h:

(JSC::UnlinkedMetadataTable::offsetInMetadataTable):

  • bytecode/ValueProfile.h:

(JSC::ValueProfileBase::ValueProfileBase):
(JSC::ValueProfileBase::clearBuckets):
(JSC::ValueProfile::offsetOfFirstBucket):

  • dfg/DFGCommonData.h:
  • dfg/DFGJITCode.cpp:
  • dfg/DFGJITCode.h:
  • dfg/DFGJITCompiler.cpp:

(JSC::DFG::JITCompiler::link):

  • dfg/DFGOSREntry.cpp:

(JSC::DFG::prepareOSREntry):

  • dfg/DFGOSRExit.cpp:

(JSC::DFG::OSRExit::compileExit):

  • dfg/DFGOSRExitCompilerCommon.cpp:

(JSC::DFG::handleExitCounts):
(JSC::DFG::callerReturnPC):
(JSC::DFG::reifyInlinedCallFrames):
(JSC::DFG::adjustAndJumpToTarget):

  • dfg/DFGOperations.cpp:

(JSC::DFG::JSC_DEFINE_JIT_OPERATION):

  • dfg/DFGSpeculativeJIT.cpp:

(JSC::DFG::SpeculativeJIT::compilePutPrivateName):
(JSC::DFG::SpeculativeJIT::compileValueAdd):
(JSC::DFG::SpeculativeJIT::compileValueSub):
(JSC::DFG::SpeculativeJIT::compileValueNegate):
(JSC::DFG::SpeculativeJIT::compileValueMul):
(JSC::DFG::SpeculativeJIT::compileLogShadowChickenTail):

  • dfg/DFGSpeculativeJIT32_64.cpp:

(JSC::DFG::SpeculativeJIT::emitCall):

  • dfg/DFGSpeculativeJIT64.cpp:

(JSC::DFG::SpeculativeJIT::emitCall):

  • ftl/FTLCompile.cpp:

(JSC::FTL::compile):

  • ftl/FTLJITCode.h:
  • ftl/FTLLink.cpp:

(JSC::FTL::link):

  • ftl/FTLLowerDFGToB3.cpp:

(JSC::FTL::DFG::LowerDFGToB3::addMathIC):
(JSC::FTL::DFG::LowerDFGToB3::compileUnaryMathIC):
(JSC::FTL::DFG::LowerDFGToB3::compileBinaryMathIC):
(JSC::FTL::DFG::LowerDFGToB3::compilePutPrivateName):
(JSC::FTL::DFG::LowerDFGToB3::compileCompareStrictEq):

  • ftl/FTLOSRExitCompiler.cpp:

(JSC::FTL::compileStub):

  • generator/Metadata.rb:
  • jit/AssemblyHelpers.cpp:

(JSC::AssemblyHelpers::storeProperty):
(JSC::AssemblyHelpers::emitVirtualCall):
(JSC::AssemblyHelpers::emitVirtualCallWithoutMovingGlobalObject):

  • jit/AssemblyHelpers.h:

(JSC::AssemblyHelpers::copyCalleeSavesToEntryFrameCalleeSavesBuffer):

  • jit/BaselineJITCode.cpp: Added.

(JSC::MathICHolder::addJITAddIC):
(JSC::MathICHolder::addJITMulIC):
(JSC::MathICHolder::addJITSubIC):
(JSC::MathICHolder::addJITNegIC):
(JSC::MathICHolder::adoptMathICs):
(JSC::BaselineJITCode::BaselineJITCode):
(JSC::BaselineJITCode::~BaselineJITCode):

  • jit/BaselineJITCode.h: Added.

(JSC::JITConstantPool::add):
(JSC::JITConstantPool::size const):
(JSC::JITConstantPool::at const):

  • jit/BaselineJITPlan.cpp:

(JSC::BaselineJITPlan::finalize):

  • jit/CCallHelpers.cpp:

(JSC::CCallHelpers::logShadowChickenTailPacketImpl):
(JSC::CCallHelpers::logShadowChickenTailPacket):

  • jit/CCallHelpers.h:
  • jit/CallFrameShuffleData.cpp:

(JSC::CallFrameShuffleData::setupCalleeSaveRegisters):

  • jit/CallFrameShuffleData.h:
  • jit/CallFrameShuffler.cpp:

(JSC::CallFrameShuffler::CallFrameShuffler):
(JSC::CallFrameShuffler::prepareForTailCall):

  • jit/CallFrameShuffler.h:

(JSC::CallFrameShuffler::snapshot const):

  • jit/JIT.cpp:

(JSC::JIT::JIT):
(JSC::JIT::emitEnterOptimizationCheck):
(JSC::JIT::emitNotifyWriteWatchpoint):
(JSC::JIT::emitVarReadOnlyCheck):
(JSC::JIT::assertStackPointerOffset):
(JSC::JIT::resetSP):
(JSC::JIT::emitPutCodeBlockToFrameInPrologue):
(JSC::JIT::privateCompileMainPass):
(JSC::JIT::privateCompileSlowCases):
(JSC::JIT::emitMaterializeMetadataAndConstantPoolRegisters):
(JSC::JIT::emitRestoreCalleeSaves):
(JSC::JIT::compileAndLinkWithoutFinalizing):
(JSC::JIT::link):
(JSC::JIT::finalizeOnMainThread):
(JSC::JIT::privateCompile):
(JSC::JIT::frameRegisterCountFor):
(JSC::JIT::stackPointerOffsetFor):

  • jit/JIT.h:
  • jit/JITArithmetic.cpp:

(JSC::JIT::emit_compareAndJumpSlowImpl):
(JSC::JIT::emit_compareAndJumpSlow):
(JSC::JIT::emit_op_negate):
(JSC::JIT::emit_op_add):
(JSC::JIT::emitMathICFast):
(JSC::JIT::emitMathICSlow):
(JSC::JIT::emit_op_div):
(JSC::JIT::emit_op_mul):
(JSC::JIT::emit_op_sub):

  • jit/JITCall.cpp:

(JSC::JIT::emitPutCallResult):
(JSC::JIT::compileSetupFrame):
(JSC::JIT::compileCallEval):
(JSC::JIT::compileCallEvalSlowCase):
(JSC::JIT::compileTailCall):
(JSC::JIT::compileOpCall):
(JSC::JIT::compileOpCallSlowCase):
(JSC::JIT::emit_op_iterator_open):
(JSC::JIT::emitSlow_op_iterator_open):
(JSC::JIT::emit_op_iterator_next):
(JSC::JIT::emitSlow_op_iterator_next):

  • jit/JITCall32_64.cpp:

(JSC::JIT::emitPutCallResult):
(JSC::JIT::compileSetupFrame):
(JSC::JIT::compileCallEval):
(JSC::JIT::compileCallEvalSlowCase):
(JSC::JIT::compileOpCall):
(JSC::JIT::compileOpCallSlowCase):
(JSC::JIT::emit_op_iterator_open):
(JSC::JIT::emitSlow_op_iterator_open):
(JSC::JIT::emit_op_iterator_next):
(JSC::JIT::emitSlow_op_iterator_next):

  • jit/JITCode.h:

(JSC::JITCode::useDataIC):
(JSC::JITCode::pcToCodeOriginMap):

  • jit/JITCompilationKey.cpp:

(JSC::JITCompilationKey::dump const):

  • jit/JITCompilationKey.h:

(JSC::JITCompilationKey::JITCompilationKey):
(JSC::JITCompilationKey::operator! const):
(JSC::JITCompilationKey::isHashTableDeletedValue const):
(JSC::JITCompilationKey::operator== const):
(JSC::JITCompilationKey::hash const):
(JSC::JITCompilationKey::profiledBlock const): Deleted.

  • jit/JITInlineCacheGenerator.cpp:

(JSC::JITInlineCacheGenerator::JITInlineCacheGenerator):
(JSC::JITInlineCacheGenerator::finalize):
(JSC::JITInlineCacheGenerator::generateBaselineDataICFastPath):
(JSC::JITGetByIdGenerator::JITGetByIdGenerator):
(JSC::generateGetByIdInlineAccess):
(JSC::JITGetByIdGenerator::generateBaselineDataICFastPath):
(JSC::JITGetByIdWithThisGenerator::generateBaselineDataICFastPath):
(JSC::JITPutByIdGenerator::generateBaselineDataICFastPath):
(JSC::JITDelByValGenerator::generateFastPath):
(JSC::JITDelByIdGenerator::generateFastPath):
(JSC::JITInByValGenerator::generateFastPath):
(JSC::JITInByIdGenerator::generateBaselineDataICFastPath):
(JSC::JITInstanceOfGenerator::generateFastPath):
(JSC::JITGetByValGenerator::generateFastPath):
(JSC::JITPutByValGenerator::generateFastPath):
(JSC::JITPrivateBrandAccessGenerator::generateFastPath):

  • jit/JITInlineCacheGenerator.h:
  • jit/JITInlines.h:

(JSC::JIT::isOperandConstantDouble):
(JSC::JIT::isOperandConstantInt):
(JSC::JIT::isKnownCell):
(JSC::JIT::getConstantOperand):
(JSC::JIT::appendCallWithExceptionCheckSetJSValueResultWithProfile):
(JSC::JIT::linkSlowCaseIfNotJSCell):
(JSC::JIT::advanceToNextCheckpoint):
(JSC::JIT::emitJumpSlowToHotForCheckpoint):
(JSC::JIT::isOperandConstantChar):
(JSC::JIT::emitValueProfilingSite):
(JSC::JIT::emitValueProfilingSiteIfProfiledOpcode):
(JSC::JIT::emitArrayProfilingSiteWithCell):
(JSC::JIT::emitLoadDouble):
(JSC::JIT::emitJumpSlowCaseIfNotJSCell):
(JSC::JIT::emitGetVirtualRegister):
(JSC::JIT::jumpTarget):
(JSC::JIT::loadPtrFromMetadata):
(JSC::JIT::load32FromMetadata):
(JSC::JIT::load8FromMetadata):
(JSC::JIT::store8ToMetadata):
(JSC::JIT::store32ToMetadata):
(JSC::JIT::materializePointerIntoMetadata):
(JSC::JIT::loadConstant):
(JSC::JIT::loadGlobalObject):
(JSC::JIT::loadCodeBlockConstant):
(JSC::JIT::copiedGetPutInfo): Deleted.
(JSC::JIT::copiedArithProfile): Deleted.

  • jit/JITOpcodes.cpp:

(JSC::JIT::emit_op_mov):
(JSC::JIT::emit_op_new_object):
(JSC::JIT::emitSlow_op_new_object):
(JSC::JIT::emit_op_overrides_has_instance):
(JSC::JIT::emit_op_instanceof):
(JSC::JIT::emitSlow_op_instanceof):
(JSC::JIT::emit_op_typeof_is_undefined):
(JSC::JIT::op_ret_handlerGenerator):
(JSC::JIT::emit_op_to_primitive):
(JSC::JIT::emit_op_set_function_name):
(JSC::JIT::emit_op_jfalse):
(JSC::JIT::valueIsFalseyGenerator):
(JSC::JIT::emit_op_jeq_null):
(JSC::JIT::emit_op_jneq_null):
(JSC::JIT::emit_op_jeq_ptr):
(JSC::JIT::emit_op_jneq_ptr):
(JSC::JIT::emit_op_jtrue):
(JSC::JIT::valueIsTruthyGenerator):
(JSC::JIT::emit_op_throw):
(JSC::JIT::op_throw_handlerGenerator):
(JSC::JIT::emitSlow_op_jstricteq):
(JSC::JIT::emitSlow_op_jnstricteq):
(JSC::JIT::emit_op_to_number):
(JSC::JIT::emit_op_to_numeric):
(JSC::JIT::emit_op_to_object):
(JSC::JIT::emit_op_catch):
(JSC::JIT::emit_op_switch_imm):
(JSC::JIT::emit_op_switch_char):
(JSC::JIT::emit_op_switch_string):
(JSC::JIT::emit_op_debug):
(JSC::JIT::emit_op_eq_null):
(JSC::JIT::emit_op_neq_null):
(JSC::JIT::emit_op_enter):
(JSC::JIT::op_enter_handlerGenerator):
(JSC::JIT::emit_op_to_this):
(JSC::JIT::emit_op_create_this):
(JSC::JIT::emitSlow_op_eq):
(JSC::JIT::emitSlow_op_neq):
(JSC::JIT::emitSlow_op_jeq):
(JSC::JIT::emitSlow_op_jneq):
(JSC::JIT::emitSlow_op_instanceof_custom):
(JSC::JIT::emit_op_loop_hint):
(JSC::JIT::emitSlow_op_check_traps):
(JSC::JIT::op_check_traps_handlerGenerator):
(JSC::JIT::emit_op_new_regexp):
(JSC::JIT::emitNewFuncCommon):
(JSC::JIT::emitNewFuncExprCommon):
(JSC::JIT::emit_op_new_array):
(JSC::JIT::emit_op_new_array_with_size):
(JSC::JIT::emit_op_profile_type):
(JSC::JIT::emit_op_log_shadow_chicken_tail):
(JSC::JIT::emit_op_profile_control_flow):
(JSC::JIT::emit_op_get_argument):
(JSC::JIT::emit_op_get_prototype_of):

  • jit/JITOpcodes32_64.cpp:

(JSC::JIT::emit_op_new_object):
(JSC::JIT::emitSlow_op_new_object):
(JSC::JIT::emit_op_overrides_has_instance):
(JSC::JIT::emit_op_instanceof):
(JSC::JIT::emitSlow_op_instanceof):
(JSC::JIT::emitSlow_op_instanceof_custom):
(JSC::JIT::emit_op_typeof_is_undefined):
(JSC::JIT::emit_op_set_function_name):
(JSC::JIT::emit_op_jfalse):
(JSC::JIT::emit_op_jtrue):
(JSC::JIT::emit_op_jeq_null):
(JSC::JIT::emit_op_jneq_null):
(JSC::JIT::emit_op_jneq_ptr):
(JSC::JIT::emitSlow_op_eq):
(JSC::JIT::compileOpEqJumpSlow):
(JSC::JIT::emitSlow_op_neq):
(JSC::JIT::emitSlow_op_jstricteq):
(JSC::JIT::emitSlow_op_jnstricteq):
(JSC::JIT::emit_op_eq_null):
(JSC::JIT::emit_op_neq_null):
(JSC::JIT::emit_op_throw):
(JSC::JIT::emit_op_to_number):
(JSC::JIT::emit_op_to_numeric):
(JSC::JIT::emit_op_to_object):
(JSC::JIT::emit_op_catch):
(JSC::JIT::emit_op_switch_imm):
(JSC::JIT::emit_op_switch_char):
(JSC::JIT::emit_op_switch_string):
(JSC::JIT::emit_op_enter):
(JSC::JIT::emit_op_create_this):
(JSC::JIT::emit_op_to_this):
(JSC::JIT::emit_op_profile_type):
(JSC::JIT::emit_op_log_shadow_chicken_tail):

  • jit/JITOperations.cpp:

(JSC::JSC_DEFINE_JIT_OPERATION):

  • jit/JITOperations.h:
  • jit/JITPlan.cpp:

(JSC::JITPlan::key):

  • jit/JITPropertyAccess.cpp:

(JSC::JIT::emit_op_get_by_val):
(JSC::JIT::generateGetByValSlowCase):
(JSC::JIT::slow_op_get_by_val_prepareCallGenerator):
(JSC::JIT::emit_op_get_private_name):
(JSC::JIT::emitSlow_op_get_private_name):
(JSC::JIT::slow_op_get_private_name_prepareCallGenerator):
(JSC::JIT::emit_op_set_private_brand):
(JSC::JIT::emitSlow_op_set_private_brand):
(JSC::JIT::emit_op_check_private_brand):
(JSC::JIT::emitSlow_op_check_private_brand):
(JSC::JIT::emit_op_put_by_val):
(JSC::JIT::emitSlow_op_put_by_val):
(JSC::JIT::slow_op_put_by_val_prepareCallGenerator):
(JSC::JIT::emit_op_put_private_name):
(JSC::JIT::emitSlow_op_put_private_name):
(JSC::JIT::slow_op_put_private_name_prepareCallGenerator):
(JSC::JIT::emit_op_put_getter_by_id):
(JSC::JIT::emit_op_put_setter_by_id):
(JSC::JIT::emit_op_put_getter_setter_by_id):
(JSC::JIT::emit_op_put_getter_by_val):
(JSC::JIT::emit_op_put_setter_by_val):
(JSC::JIT::emit_op_del_by_id):
(JSC::JIT::emitSlow_op_del_by_id):
(JSC::JIT::slow_op_del_by_id_prepareCallGenerator):
(JSC::JIT::emit_op_del_by_val):
(JSC::JIT::emitSlow_op_del_by_val):
(JSC::JIT::slow_op_del_by_val_prepareCallGenerator):
(JSC::JIT::emit_op_try_get_by_id):
(JSC::JIT::emitSlow_op_try_get_by_id):
(JSC::JIT::emit_op_get_by_id_direct):
(JSC::JIT::emitSlow_op_get_by_id_direct):
(JSC::JIT::emit_op_get_by_id):
(JSC::JIT::emitSlow_op_get_by_id):
(JSC::JIT::emit_op_get_by_id_with_this):
(JSC::JIT::slow_op_get_by_id_prepareCallGenerator):
(JSC::JIT::emitSlow_op_get_by_id_with_this):
(JSC::JIT::slow_op_get_by_id_with_this_prepareCallGenerator):
(JSC::JIT::emit_op_put_by_id):
(JSC::JIT::emitSlow_op_put_by_id):
(JSC::JIT::slow_op_put_by_id_prepareCallGenerator):
(JSC::JIT::emit_op_in_by_id):
(JSC::JIT::emitSlow_op_in_by_id):
(JSC::JIT::emit_op_in_by_val):
(JSC::JIT::emitSlow_op_in_by_val):
(JSC::JIT::emitHasPrivate):
(JSC::JIT::emitHasPrivateSlow):
(JSC::JIT::emitSlow_op_has_private_name):
(JSC::JIT::emitSlow_op_has_private_brand):
(JSC::JIT::emitVarInjectionCheck):
(JSC::JIT::emitResolveClosure):
(JSC::JIT::emit_op_resolve_scope):
(JSC::JIT::generateOpResolveScopeThunk):
(JSC::JIT::slow_op_resolve_scopeGenerator):
(JSC::JIT::emit_op_get_from_scope):
(JSC::JIT::emitSlow_op_get_from_scope):
(JSC::JIT::generateOpGetFromScopeThunk):
(JSC::JIT::slow_op_get_from_scopeGenerator):
(JSC::JIT::emit_op_put_to_scope):
(JSC::JIT::emitSlow_op_put_to_scope):
(JSC::JIT::slow_op_put_to_scopeGenerator):
(JSC::JIT::emit_op_get_from_arguments):
(JSC::JIT::emit_op_get_internal_field):
(JSC::JIT::emit_op_enumerator_next):
(JSC::JIT::emit_op_enumerator_get_by_val):
(JSC::JIT::emit_enumerator_has_propertyImpl):
(JSC::JIT::emitWriteBarrier):
(JSC::JIT::emitPutGlobalVariable): Deleted.
(JSC::JIT::emitPutGlobalVariableIndirect): Deleted.
(JSC::JIT::emitPutClosureVar): Deleted.

  • jit/JITPropertyAccess32_64.cpp:

(JSC::JIT::emit_op_put_getter_by_id):
(JSC::JIT::emit_op_put_setter_by_id):
(JSC::JIT::emit_op_put_getter_setter_by_id):
(JSC::JIT::emit_op_put_getter_by_val):
(JSC::JIT::emit_op_put_setter_by_val):
(JSC::JIT::emit_op_del_by_id):
(JSC::JIT::emit_op_del_by_val):
(JSC::JIT::emitSlow_op_del_by_val):
(JSC::JIT::emitSlow_op_del_by_id):
(JSC::JIT::emit_op_get_by_val):
(JSC::JIT::emitSlow_op_get_by_val):
(JSC::JIT::emit_op_get_private_name):
(JSC::JIT::emitSlow_op_get_private_name):
(JSC::JIT::emit_op_put_private_name):
(JSC::JIT::emitSlow_op_put_private_name):
(JSC::JIT::emit_op_set_private_brand):
(JSC::JIT::emitSlow_op_set_private_brand):
(JSC::JIT::emit_op_check_private_brand):
(JSC::JIT::emitSlow_op_check_private_brand):
(JSC::JIT::emit_op_put_by_val):
(JSC::JIT::emitSlow_op_put_by_val):
(JSC::JIT::emit_op_try_get_by_id):
(JSC::JIT::emitSlow_op_try_get_by_id):
(JSC::JIT::emit_op_get_by_id_direct):
(JSC::JIT::emitSlow_op_get_by_id_direct):
(JSC::JIT::emit_op_get_by_id):
(JSC::JIT::emitSlow_op_get_by_id):
(JSC::JIT::emit_op_get_by_id_with_this):
(JSC::JIT::emitSlow_op_get_by_id_with_this):
(JSC::JIT::emit_op_put_by_id):
(JSC::JIT::emitSlow_op_put_by_id):
(JSC::JIT::emit_op_in_by_id):
(JSC::JIT::emitSlow_op_in_by_id):
(JSC::JIT::emit_op_in_by_val):
(JSC::JIT::emitSlow_op_in_by_val):
(JSC::JIT::emitHasPrivate):
(JSC::JIT::emitHasPrivateSlow):
(JSC::JIT::emitVarInjectionCheck):
(JSC::JIT::emit_op_resolve_scope):
(JSC::JIT::emit_op_get_from_scope):
(JSC::JIT::emitSlow_op_get_from_scope):
(JSC::JIT::emit_op_put_to_scope):
(JSC::JIT::emitSlow_op_put_to_scope):
(JSC::JIT::emit_op_get_from_arguments):
(JSC::JIT::emit_op_get_internal_field):

  • jit/Repatch.cpp:

(JSC::tryCacheGetBy):
(JSC::tryCachePutBy):
(JSC::tryCacheInBy):
(JSC::unlinkCall):

  • llint/LLIntSlowPaths.cpp:

(JSC::LLInt::jitCompileAndSetHeuristics):
(JSC::LLInt::LLINT_SLOW_PATH_DECL):

  • llint/LowLevelInterpreter.asm:
  • llint/LowLevelInterpreter32_64.asm:
  • llint/LowLevelInterpreter64.asm:
  • runtime/CacheableIdentifier.h:
  • runtime/CacheableIdentifierInlines.h:

(JSC::CacheableIdentifier::createFromIdentifierOwnedByCodeBlock):

  • runtime/CachedTypes.cpp:

(JSC::CachedCodeBlock::numBinaryArithProfiles const):
(JSC::CachedCodeBlock::numUnaryArithProfiles const):
(JSC::UnlinkedCodeBlock::UnlinkedCodeBlock):
(JSC::CachedCodeBlock<CodeBlockType>::encode):

  • runtime/CommonSlowPaths.cpp:

(JSC::updateArithProfileForUnaryArithOp):

  • runtime/FunctionExecutable.h:
  • runtime/Options.cpp:

(JSC::Options::recomputeDependentOptions):

  • runtime/OptionsList.h:
  • runtime/ScriptExecutable.cpp:

(JSC::ScriptExecutable::prepareForExecutionImpl):

  • wasm/WasmLLIntTierUpCounter.h:

(JSC::Wasm::LLIntTierUpCounter::optimizeAfterWarmUp):
(JSC::Wasm::LLIntTierUpCounter::optimizeSoon):

  • wasm/WasmTierUpCount.cpp:

(JSC::Wasm::TierUpCount::TierUpCount):

  • wasm/WasmTierUpCount.h:

(JSC::Wasm::TierUpCount::optimizeAfterWarmUp):
(JSC::Wasm::TierUpCount::optimizeNextInvocation):
(JSC::Wasm::TierUpCount::optimizeSoon):

Source/WTF:

  • wtf/Bag.h:
  • wtf/Packed.h:

(WTF::PackedAlignedPtr::operator* const):

Tools:

  • Scripts/run-jsc-stress-tests:
  • trunk/Source/JavaScriptCore/jit/JIT.cpp

    r283098 r283102  
    7070
    7171JIT::JIT(VM& vm, CodeBlock* codeBlock, BytecodeIndex loopOSREntryBytecodeIndex)
    72     : JSInterfaceJIT(&vm, codeBlock)
     72    : JSInterfaceJIT(&vm, nullptr)
    7373    , m_interpreter(vm.interpreter)
    7474    , m_labels(codeBlock ? codeBlock->instructions().size() : 0)
     
    7878    , m_loopOSREntryBytecodeIndex(loopOSREntryBytecodeIndex)
    7979{
     80    m_globalObjectConstant = m_constantPool.add(JITConstantPool::Type::GlobalObject);
     81    m_profiledCodeBlock = codeBlock;
     82    m_unlinkedCodeBlock = codeBlock->unlinkedCodeBlock();
    8083}
    8184
     
    9194
    9295    JumpList skipOptimize;
    93    
    94     skipOptimize.append(branchAdd32(Signed, TrustedImm32(Options::executionCounterIncrementForEntry()), AbsoluteAddress(m_codeBlock->addressOfJITExecuteCounter())));
     96    loadPtr(addressFor(CallFrameSlot::codeBlock), regT0);
     97    skipOptimize.append(branchAdd32(Signed, TrustedImm32(Options::executionCounterIncrementForEntry()), Address(regT0, CodeBlock::offsetOfJITExecuteCounter())));
    9598    ASSERT(!m_bytecodeIndex.offset());
    9699
     
    114117}
    115118
    116 void JIT::emitNotifyWrite(GPRReg pointerToSet)
    117 {
     119void JIT::emitNotifyWriteWatchpoint(GPRReg pointerToSet)
     120{
     121    auto ok = branchTestPtr(Zero, pointerToSet);
    118122    addSlowCase(branch8(NotEqual, Address(pointerToSet, WatchpointSet::offsetOfState()), TrustedImm32(IsInvalidated)));
    119 }
    120 
    121 void JIT::emitVarReadOnlyCheck(ResolveType resolveType)
    122 {
    123     if (resolveType == GlobalVar || resolveType == GlobalVarWithVarInjectionChecks)
    124         addSlowCase(branch8(Equal, AbsoluteAddress(m_codeBlock->globalObject()->varReadOnlyWatchpoint()->addressOfState()), TrustedImm32(IsInvalidated)));
     123    ok.link(this);
     124}
     125
     126void JIT::emitVarReadOnlyCheck(ResolveType resolveType, GPRReg scratchGPR)
     127{
     128    if (resolveType == GlobalVar || resolveType == GlobalVarWithVarInjectionChecks) {
     129        loadGlobalObject(scratchGPR);
     130        loadPtr(Address(scratchGPR, OBJECT_OFFSETOF(JSGlobalObject, m_varReadOnlyWatchpoint)), scratchGPR);
     131        addSlowCase(branch8(Equal, Address(scratchGPR, WatchpointSet::offsetOfState()), TrustedImm32(IsInvalidated)));
     132    }
    125133}
    126134
     
    130138        return;
    131139   
    132     addPtr(TrustedImm32(stackPointerOffsetFor(m_codeBlock) * sizeof(Register)), callFrameRegister, regT0);
     140    addPtr(TrustedImm32(stackPointerOffsetFor(m_unlinkedCodeBlock) * sizeof(Register)), callFrameRegister, regT0);
    133141    Jump ok = branchPtr(Equal, regT0, stackPointerRegister);
    134142    breakpoint();
    135143    ok.link(this);
     144}
     145
     146void JIT::resetSP()
     147{
     148    addPtr(TrustedImm32(stackPointerOffsetFor(m_unlinkedCodeBlock) * sizeof(Register)), callFrameRegister, stackPointerRegister);
     149    checkStackPointerAlignment();
    136150}
    137151
     
    182196}
    183197
     198void JIT::emitPutCodeBlockToFrameInPrologue(GPRReg result)
     199{
     200    RELEASE_ASSERT(m_unlinkedCodeBlock->codeType() == FunctionCode);
     201    emitGetFromCallFrameHeaderPtr(CallFrameSlot::callee, result);
     202    loadPtr(Address(result, JSFunction::offsetOfExecutableOrRareData()), result);
     203    auto hasExecutable = branchTestPtr(Zero, result, CCallHelpers::TrustedImm32(JSFunction::rareDataTag));
     204    loadPtr(Address(result, FunctionRareData::offsetOfExecutable() - JSFunction::rareDataTag), result);
     205    hasExecutable.link(this);
     206    if (m_unlinkedCodeBlock->isConstructor())
     207        loadPtr(Address(result, FunctionExecutable::offsetOfCodeBlockForConstruct()), result);
     208    else
     209        loadPtr(Address(result, FunctionExecutable::offsetOfCodeBlockForCall()), result);
     210
     211    loadPtr(Address(result, ExecutableToCodeBlockEdge::offsetOfCodeBlock()), result);
     212    emitPutToCallFrameHeader(result, CallFrameSlot::codeBlock);
     213
     214#if ASSERT_ENABLED
     215    probeDebug([=] (Probe::Context& ctx) {
     216        CodeBlock* codeBlock = ctx.fp<CallFrame*>()->codeBlock();
     217        RELEASE_ASSERT(codeBlock->jitType() == JITType::BaselineJIT);
     218    });
     219#endif
     220}
     221
    184222void JIT::privateCompileMainPass()
    185223{
    186224    if (JITInternal::verbose)
    187         dataLog("Compiling ", *m_codeBlock, "\n");
     225        dataLog("Compiling ", *m_profiledCodeBlock, "\n");
    188226   
    189227    jitAssertTagsInPlace();
     jitAssertArgumentCountSane();

-    auto& instructions = m_codeBlock->instructions();
-    unsigned instructionCount = m_codeBlock->instructions().size();
+    auto& instructions = m_unlinkedCodeBlock->instructions();
+    unsigned instructionCount = m_unlinkedCodeBlock->instructions().size();

     m_callLinkInfoIndex = 0;

-    VM& vm = m_codeBlock->vm();
     BytecodeIndex startBytecodeIndex(0);
-    if (m_loopOSREntryBytecodeIndex && (m_codeBlock->inherits<ProgramCodeBlock>(vm) || m_codeBlock->inherits<ModuleProgramCodeBlock>(vm))) {
-        // We can only do this optimization because we execute ProgramCodeBlock's exactly once.
-        // This optimization would be invalid otherwise. When the LLInt determines it wants to
-        // do OSR entry into the baseline JIT in a loop, it will pass in the bytecode offset it
-        // was executing at when it kicked off our compilation. We only need to compile code for
-        // anything reachable from that bytecode offset.
-
-        // We only bother building the bytecode graph if it could save time and executable
-        // memory. We pick an arbitrary offset where we deem this is profitable.
-        if (m_loopOSREntryBytecodeIndex.offset() >= 200) {
-            // As a simplification, we don't find all bytecode ranges that are unreachable.
-            // Instead, we just find the minimum bytecode offset that is reachable, and
-            // compile code from that bytecode offset onwards.
-
-            BytecodeGraph graph(m_codeBlock, m_codeBlock->instructions());
-            BytecodeBasicBlock* block = graph.findBasicBlockForBytecodeOffset(m_loopOSREntryBytecodeIndex.offset());
-            RELEASE_ASSERT(block);
-
-            GraphNodeWorklist<BytecodeBasicBlock*> worklist;
-            startBytecodeIndex = BytecodeIndex();
-            worklist.push(block);
-
-            while (BytecodeBasicBlock* block = worklist.pop()) {
-                startBytecodeIndex = BytecodeIndex(std::min(startBytecodeIndex.offset(), block->leaderOffset()));
-                for (unsigned successorIndex : block->successors())
-                    worklist.push(&graph[successorIndex]);
-
-                // Also add catch blocks for bytecodes that throw.
-                if (m_codeBlock->numberOfExceptionHandlers()) {
-                    for (unsigned bytecodeOffset = block->leaderOffset(); bytecodeOffset < block->leaderOffset() + block->totalLength();) {
-                        auto instruction = instructions.at(bytecodeOffset);
-                        if (auto* handler = m_codeBlock->handlerForBytecodeIndex(BytecodeIndex(bytecodeOffset)))
-                            worklist.push(graph.findBasicBlockWithLeaderOffset(handler->target));
-
-                        bytecodeOffset += instruction->size();
-                    }
-                }
-            }
-        }
-    }

     m_bytecodeCountHavingSlowCase = 0;
…
         unsigned bytecodeOffset = m_bytecodeIndex.offset();
         if (UNLIKELY(Options::traceBaselineJITExecution())) {
-            CodeBlock* codeBlock = m_codeBlock;
             probeDebug([=] (Probe::Context& ctx) {
+                CodeBlock* codeBlock = ctx.fp<CallFrame*>()->codeBlock();
                 dataLogLn("JIT [", bytecodeOffset, "] ", opcodeNames[opcodeID], " cfr ", RawPointer(ctx.fp()), " @ ", codeBlock);
             });
         }
+
+        if (opcodeID != op_catch)
+            assertStackPointerOffset();

         switch (opcodeID) {
…
         BytecodeIndex firstTo = m_bytecodeIndex;

-        const Instruction* currentInstruction = m_codeBlock->instructions().at(m_bytecodeIndex).ptr();
+        const Instruction* currentInstruction = m_unlinkedCodeBlock->instructions().at(m_bytecodeIndex).ptr();

         if (JITInternal::verbose)
…
         if (UNLIKELY(Options::traceBaselineJITExecution())) {
             unsigned bytecodeOffset = m_bytecodeIndex.offset();
-            CodeBlock* codeBlock = m_codeBlock;
             probeDebug([=] (Probe::Context& ctx) {
+                CodeBlock* codeBlock = ctx.fp<CallFrame*>()->codeBlock();
                 dataLogLn("JIT [", bytecodeOffset, "] SLOW ", opcodeNames[opcodeID], " cfr ", RawPointer(ctx.fp()), " @ ", codeBlock);
             });
…
 }

+void JIT::emitMaterializeMetadataAndConstantPoolRegisters()
+{
+    loadPtr(addressFor(CallFrameSlot::codeBlock), regT0);
+    loadPtr(Address(regT0, CodeBlock::offsetOfMetadataTable()), s_metadataGPR);
+    loadPtr(Address(regT0, CodeBlock::offsetOfJITData()), regT0);
+    loadPtr(Address(regT0, CodeBlock::JITData::offsetOfJITConstantPool()), s_constantsGPR);
+}
+
+void JIT::emitRestoreCalleeSaves()
+{
+    Base::emitRestoreCalleeSavesFor(&RegisterAtOffsetList::llintBaselineCalleeSaveRegisters());
+}
+
 void JIT::compileAndLinkWithoutFinalizing(JITCompilationEffort effort)
 {
-    DFG::CapabilityLevel level = m_codeBlock->capabilityLevel();
+    DFG::CapabilityLevel level = m_profiledCodeBlock->capabilityLevel();
     switch (level) {
     case DFG::CannotCompile:
         m_canBeOptimized = false;
-        m_canBeOptimizedOrInlined = false;
         m_shouldEmitProfiling = false;
         break;
…
     case DFG::CanCompileAndInline:
         m_canBeOptimized = true;
-        m_canBeOptimizedOrInlined = true;
         m_shouldEmitProfiling = true;
         break;
…
         break;
     }
-
-    switch (m_codeBlock->codeType()) {
-    case GlobalCode:
-    case ModuleCode:
-    case EvalCode:
-        m_codeBlock->m_shouldAlwaysBeInlined = false;
-        break;
-    case FunctionCode:
-        // We could have already set it to false because we detected an uninlineable call.
-        // Don't override that observation.
-        m_codeBlock->m_shouldAlwaysBeInlined &= canInline(level) && DFG::mightInlineFunction(m_codeBlock);
-        break;
-    }
-
-    if (m_codeBlock->numberOfUnlinkedSwitchJumpTables() || m_codeBlock->numberOfUnlinkedStringSwitchJumpTables()) {
-        ConcurrentJSLocker locker(m_codeBlock->m_lock);
-        if (m_codeBlock->numberOfUnlinkedSwitchJumpTables())
-            m_codeBlock->ensureJITData(locker).m_switchJumpTables = FixedVector<SimpleJumpTable>(m_codeBlock->numberOfUnlinkedSwitchJumpTables());
-        if (m_codeBlock->numberOfUnlinkedStringSwitchJumpTables())
-            m_codeBlock->ensureJITData(locker).m_stringSwitchJumpTables = FixedVector<StringJumpTable>(m_codeBlock->numberOfUnlinkedStringSwitchJumpTables());
-    }
-
-    if (UNLIKELY(Options::dumpDisassembly() || (m_vm->m_perBytecodeProfiler && Options::disassembleBaselineForProfiler())))
-        m_disassembler = makeUnique<JITDisassembler>(m_codeBlock);
+
+    if (m_unlinkedCodeBlock->numberOfUnlinkedSwitchJumpTables() || m_unlinkedCodeBlock->numberOfUnlinkedStringSwitchJumpTables()) {
+        if (m_unlinkedCodeBlock->numberOfUnlinkedSwitchJumpTables())
+            m_switchJumpTables = FixedVector<SimpleJumpTable>(m_unlinkedCodeBlock->numberOfUnlinkedSwitchJumpTables());
+        if (m_unlinkedCodeBlock->numberOfUnlinkedStringSwitchJumpTables())
+            m_stringSwitchJumpTables = FixedVector<StringJumpTable>(m_unlinkedCodeBlock->numberOfUnlinkedStringSwitchJumpTables());
+    }
+
+    if (UNLIKELY(Options::dumpDisassembly() || (m_vm->m_perBytecodeProfiler && Options::disassembleBaselineForProfiler()))) {
+        // FIXME: build a disassembler off of UnlinkedCodeBlock.
+        m_disassembler = makeUnique<JITDisassembler>(m_profiledCodeBlock);
+    }
     if (UNLIKELY(m_vm->m_perBytecodeProfiler)) {
+        // FIXME: build profiler disassembler off UnlinkedCodeBlock.
         m_compilation = adoptRef(
             new Profiler::Compilation(
-                m_vm->m_perBytecodeProfiler->ensureBytecodesFor(m_codeBlock),
+                m_vm->m_perBytecodeProfiler->ensureBytecodesFor(m_profiledCodeBlock),
                 Profiler::Baseline));
-        m_compilation->addProfiledBytecodes(*m_vm->m_perBytecodeProfiler, m_codeBlock);
+        m_compilation->addProfiledBytecodes(*m_vm->m_perBytecodeProfiler, m_profiledCodeBlock);
     }

…

     emitFunctionPrologue();
-    emitPutToCallFrameHeader(m_codeBlock, CallFrameSlot::codeBlock);
+    if (m_unlinkedCodeBlock->codeType() == FunctionCode)
+        emitPutCodeBlockToFrameInPrologue();

     Label beginLabel(this);

-    int frameTopOffset = stackPointerOffsetFor(m_codeBlock) * sizeof(Register);
+    int frameTopOffset = stackPointerOffsetFor(m_unlinkedCodeBlock) * sizeof(Register);
     unsigned maxFrameSize = -frameTopOffset;
     addPtr(TrustedImm32(frameTopOffset), callFrameRegister, regT1);
…
     checkStackPointerAlignment();

-    emitSaveCalleeSaves();
+    emitSaveCalleeSavesFor(&RegisterAtOffsetList::llintBaselineCalleeSaveRegisters());
     emitMaterializeTagCheckRegisters();
-
-    if (m_codeBlock->codeType() == FunctionCode) {
+    emitMaterializeMetadataAndConstantPoolRegisters();
+
+    if (m_unlinkedCodeBlock->codeType() == FunctionCode) {
         ASSERT(!m_bytecodeIndex);
-        if (shouldEmitProfiling()) {
-            for (unsigned argument = 0; argument < m_codeBlock->numParameters(); ++argument) {
+        if (shouldEmitProfiling() && (!m_unlinkedCodeBlock->isConstructor() || m_unlinkedCodeBlock->numParameters() > 1)) {
+            emitGetFromCallFrameHeaderPtr(CallFrameSlot::codeBlock, regT2);
+            loadPtr(Address(regT2, CodeBlock::offsetOfArgumentValueProfiles() + FixedVector<ValueProfile>::offsetOfStorage()), regT2);
+
+            for (unsigned argument = 0; argument < m_unlinkedCodeBlock->numParameters(); ++argument) {
                 // If this is a constructor, then we want to put in a dummy profiling site (to
                 // keep things consistent) but we don't actually want to record the dummy value.
-                if (m_codeBlock->isConstructor() && !argument)
+                if (m_unlinkedCodeBlock->isConstructor() && !argument)
                     continue;
                 int offset = CallFrame::argumentOffsetIncludingThis(argument) * static_cast<int>(sizeof(Register));
…
                 load32(Address(callFrameRegister, offset + OBJECT_OFFSETOF(JSValue, u.asBits.tag)), resultRegs.tagGPR());
 #endif
-                emitValueProfilingSite(m_codeBlock->valueProfileForArgument(argument), resultRegs);
+                storeValue(resultRegs, Address(regT2, argument * sizeof(ValueProfile) + ValueProfile::offsetOfFirstBucket()));
             }
         }
     }

-    RELEASE_ASSERT(!JITCode::isJIT(m_codeBlock->jitType()));
+    RELEASE_ASSERT(!JITCode::isJIT(m_profiledCodeBlock->jitType()));

     if (UNLIKELY(sizeMarker))
…
     if (maxFrameExtentForSlowPathCall)
         addPtr(TrustedImm32(-static_cast<int32_t>(maxFrameExtentForSlowPathCall)), stackPointerRegister);
-    callOperationWithCallFrameRollbackOnException(operationThrowStackOverflowError, m_codeBlock);
+    emitGetFromCallFrameHeaderPtr(CallFrameSlot::codeBlock, regT0);
+    callOperationWithCallFrameRollbackOnException(operationThrowStackOverflowError, regT0);

     // If the number of parameters is 1, we never require arity fixup.
-    bool requiresArityFixup = m_codeBlock->m_numParameters != 1;
-    if (m_codeBlock->codeType() == FunctionCode && requiresArityFixup) {
+    bool requiresArityFixup = m_unlinkedCodeBlock->numParameters() != 1;
+    if (m_unlinkedCodeBlock->codeType() == FunctionCode && requiresArityFixup) {
         m_arityCheck = label();
-        store8(TrustedImm32(0), &m_codeBlock->m_shouldAlwaysBeInlined);
+
         emitFunctionPrologue();
-        emitPutToCallFrameHeader(m_codeBlock, CallFrameSlot::codeBlock);
+        emitPutCodeBlockToFrameInPrologue(regT0);
+        store8(TrustedImm32(0), Address(regT0, CodeBlock::offsetOfShouldAlwaysBeInlined()));

         load32(payloadFor(CallFrameSlot::argumentCountIncludingThis), regT1);
-        branch32(AboveOrEqual, regT1, TrustedImm32(m_codeBlock->m_numParameters)).linkTo(beginLabel, this);
+        branch32(AboveOrEqual, regT1, TrustedImm32(m_unlinkedCodeBlock->numParameters())).linkTo(beginLabel, this);

         m_bytecodeIndex = BytecodeIndex(0);
…
         if (maxFrameExtentForSlowPathCall)
             addPtr(TrustedImm32(-static_cast<int32_t>(maxFrameExtentForSlowPathCall)), stackPointerRegister);
-        callOperationWithCallFrameRollbackOnException(m_codeBlock->isConstructor() ? operationConstructArityCheck : operationCallArityCheck, m_codeBlock->globalObject());
+        loadPtr(Address(regT0, CodeBlock::offsetOfGlobalObject()), argumentGPR0);
+        callOperationWithCallFrameRollbackOnException(m_unlinkedCodeBlock->isConstructor() ? operationConstructArityCheck : operationCallArityCheck, argumentGPR0);
         if (maxFrameExtentForSlowPathCall)
             addPtr(TrustedImm32(maxFrameExtentForSlowPathCall), stackPointerRegister);
…
     m_pcToCodeOriginMapBuilder.appendItem(label(), PCToCodeOriginMapBuilder::defaultCodeOrigin());

-    m_linkBuffer = std::unique_ptr<LinkBuffer>(new LinkBuffer(*this, m_codeBlock, LinkBuffer::Profile::BaselineJIT, effort));
+    m_linkBuffer = std::unique_ptr<LinkBuffer>(new LinkBuffer(*this, m_unlinkedCodeBlock, LinkBuffer::Profile::BaselineJIT, effort));
     link();
 }
…
         case SwitchRecord::Immediate:
         case SwitchRecord::Character: {
-            const UnlinkedSimpleJumpTable& unlinkedTable = m_codeBlock->unlinkedSwitchJumpTable(tableIndex);
-            SimpleJumpTable& linkedTable = m_codeBlock->switchJumpTable(tableIndex);
+            const UnlinkedSimpleJumpTable& unlinkedTable = m_unlinkedCodeBlock->unlinkedSwitchJumpTable(tableIndex);
+            SimpleJumpTable& linkedTable = m_switchJumpTables[tableIndex];
             linkedTable.m_ctiDefault = patchBuffer.locationOf<JSSwitchPtrTag>(m_labels[bytecodeOffset + record.defaultOffset]);
             for (unsigned j = 0; j < unlinkedTable.m_branchOffsets.size(); ++j) {
…

         case SwitchRecord::String: {
-            const UnlinkedStringJumpTable& unlinkedTable = m_codeBlock->unlinkedStringSwitchJumpTable(tableIndex);
-            StringJumpTable& linkedTable = m_codeBlock->stringSwitchJumpTable(tableIndex);
+            const UnlinkedStringJumpTable& unlinkedTable = m_unlinkedCodeBlock->unlinkedStringSwitchJumpTable(tableIndex);
+            StringJumpTable& linkedTable = m_stringSwitchJumpTables[tableIndex];
             auto ctiDefault = patchBuffer.locationOf<JSSwitchPtrTag>(m_labels[bytecodeOffset + record.defaultOffset]);
             for (auto& location : unlinkedTable.m_offsetTable.values()) {
…
     }

+#if USE(JSVALUE64)
+    auto finalizeICs = [&] (auto& generators) {
+        for (auto& gen : generators) {
+            gen.m_unlinkedStubInfo->start = patchBuffer.locationOf<JITStubRoutinePtrTag>(gen.m_start);
+            gen.m_unlinkedStubInfo->doneLocation = patchBuffer.locationOf<JSInternalPtrTag>(gen.m_done);
+            gen.m_unlinkedStubInfo->slowPathStartLocation = patchBuffer.locationOf<JITStubRoutinePtrTag>(gen.m_slowPathBegin);
+        }
+    };
+
+    finalizeICs(m_getByIds);
+    finalizeICs(m_getByVals);
+    finalizeICs(m_getByIdsWithThis);
+    finalizeICs(m_putByIds);
+    finalizeICs(m_putByVals);
+    finalizeICs(m_delByIds);
+    finalizeICs(m_delByVals);
+    finalizeICs(m_inByIds);
+    finalizeICs(m_inByVals);
+    finalizeICs(m_instanceOfs);
+    finalizeICs(m_privateBrandAccesses);
+#else
     finalizeInlineCaches(m_getByIds, patchBuffer);
     finalizeInlineCaches(m_getByVals, patchBuffer);
…
     finalizeInlineCaches(m_instanceOfs, patchBuffer);
     finalizeInlineCaches(m_privateBrandAccesses, patchBuffer);
+#endif

     for (auto& compilationInfo : m_callCompilationInfo) {
+#if USE(JSVALUE64)
+        UnlinkedCallLinkInfo& info = *compilationInfo.unlinkedCallLinkInfo;
+        info.doneLocation = patchBuffer.locationOf<JSInternalPtrTag>(compilationInfo.doneLocation);
+#else
         CallLinkInfo& info = *compilationInfo.callLinkInfo;
         info.setCodeLocations(
             patchBuffer.locationOf<JSInternalPtrTag>(compilationInfo.slowPathStart),
             patchBuffer.locationOf<JSInternalPtrTag>(compilationInfo.doneLocation));
-    }
-
-    {
-        JITCodeMapBuilder jitCodeMapBuilder;
-        for (unsigned bytecodeOffset = 0; bytecodeOffset < m_labels.size(); ++bytecodeOffset) {
-            if (m_labels[bytecodeOffset].isSet())
-                jitCodeMapBuilder.append(BytecodeIndex(bytecodeOffset), patchBuffer.locationOf<JSEntryPtrTag>(m_labels[bytecodeOffset]));
-        }
-        m_codeBlock->setJITCodeMap(jitCodeMapBuilder.finalize());
+#endif
+
+    }
+
+    JITCodeMapBuilder jitCodeMapBuilder;
+    for (unsigned bytecodeOffset = 0; bytecodeOffset < m_labels.size(); ++bytecodeOffset) {
+        if (m_labels[bytecodeOffset].isSet())
+            jitCodeMapBuilder.append(BytecodeIndex(bytecodeOffset), patchBuffer.locationOf<JSEntryPtrTag>(m_labels[bytecodeOffset]));
     }

…

     if (UNLIKELY(m_compilation)) {
+        // FIXME: should we make the bytecode profiler know about UnlinkedCodeBlock?
         if (Options::disassembleBaselineForProfiler())
             m_disassembler->reportToProfiler(m_compilation.get(), patchBuffer);
-        m_vm->m_perBytecodeProfiler->addCompilation(m_codeBlock, *m_compilation);
+        m_vm->m_perBytecodeProfiler->addCompilation(m_profiledCodeBlock, *m_compilation);
     }

…
         m_pcToCodeOriginMap = makeUnique<PCToCodeOriginMap>(WTFMove(m_pcToCodeOriginMapBuilder), patchBuffer);

+    // FIXME: Make a version of CodeBlockWithJITType that knows about UnlinkedCodeBlock.
     CodeRef<JSEntryPtrTag> result = FINALIZE_CODE(
         patchBuffer, JSEntryPtrTag,
-        "Baseline JIT code for %s", toCString(CodeBlockWithJITType(m_codeBlock, JITType::BaselineJIT)).data());
+        "Baseline JIT code for %s", toCString(CodeBlockWithJITType(m_profiledCodeBlock, JITType::BaselineJIT)).data());

     MacroAssemblerCodePtr<JSEntryPtrTag> withArityCheck = patchBuffer.locationOf<JSEntryPtrTag>(m_arityCheck);
-    m_jitCode = adoptRef(*new DirectJITCode(result, withArityCheck, JITType::BaselineJIT));
+    m_jitCode = adoptRef(*new BaselineJITCode(result, withArityCheck));
+
+    m_jitCode->m_unlinkedCalls = WTFMove(m_unlinkedCalls);
+    m_jitCode->m_evalCallLinkInfos = WTFMove(m_evalCallLinkInfos);
+    m_jitCode->m_unlinkedStubInfos = WTFMove(m_unlinkedStubInfos);
+    m_jitCode->m_switchJumpTables = WTFMove(m_switchJumpTables);
+    m_jitCode->m_stringSwitchJumpTables = WTFMove(m_stringSwitchJumpTables);
+    m_jitCode->m_jitCodeMap = jitCodeMapBuilder.finalize();
+    m_jitCode->adoptMathICs(m_mathICs);
+    m_jitCode->m_constantPool = WTFMove(m_constantPool);
+#if USE(JSVALUE64)
+    m_jitCode->m_isShareable = m_isShareable;
+#else
+    m_jitCode->m_isShareable = false;
+#endif

     if (JITInternal::verbose)
-        dataLogF("JIT generated code for %p at [%p, %p).\n", m_codeBlock, result.executableMemory()->start().untaggedPtr(), result.executableMemory()->end().untaggedPtr());
-}
-
-CompilationResult JIT::finalizeOnMainThread()
+        dataLogF("JIT generated code for %p at [%p, %p).\n", m_unlinkedCodeBlock, result.executableMemory()->start().untaggedPtr(), result.executableMemory()->end().untaggedPtr());
+}
+
+CompilationResult JIT::finalizeOnMainThread(CodeBlock* codeBlock)
 {
     RELEASE_ASSERT(!isCompilationThread());
…
     m_linkBuffer->runMainThreadFinalizationTasks();

-    {
-        ConcurrentJSLocker locker(m_codeBlock->m_lock);
-        m_codeBlock->shrinkToFit(locker, CodeBlock::ShrinkMode::LateShrink);
-    }
-
-    for (size_t i = 0; i < m_codeBlock->numberOfExceptionHandlers(); ++i) {
-        HandlerInfo& handler = m_codeBlock->exceptionHandler(i);
-        // FIXME: <rdar://problem/39433318>.
-        handler.nativeCode = m_codeBlock->jitCodeMap().find(BytecodeIndex(handler.target)).retagged<ExceptionHandlerPtrTag>();
-    }
-
     if (m_pcToCodeOriginMap)
-        m_codeBlock->setPCToCodeOriginMap(WTFMove(m_pcToCodeOriginMap));
+        m_jitCode->m_pcToCodeOriginMap = WTFMove(m_pcToCodeOriginMap);

     m_vm->machineCodeBytesPerBytecodeWordForBaselineJIT->add(
         static_cast<double>(m_jitCode->size()) /
-        static_cast<double>(m_codeBlock->instructionsSize()));
-
-    m_codeBlock->setJITCode(m_jitCode.releaseNonNull());
+        static_cast<double>(m_unlinkedCodeBlock->instructionsSize()));
+
+    codeBlock->setupWithUnlinkedBaselineCode(m_jitCode.releaseNonNull());

     return CompilationSuccessful;
…
 }

-CompilationResult JIT::privateCompile(JITCompilationEffort effort)
+CompilationResult JIT::privateCompile(CodeBlock* codeBlock, JITCompilationEffort effort)
 {
     doMainThreadPreparationBeforeCompile();
     compileAndLinkWithoutFinalizing(effort);
-    return finalizeOnMainThread();
+    return finalizeOnMainThread(codeBlock);
 }
…
 }

+unsigned JIT::frameRegisterCountFor(UnlinkedCodeBlock* codeBlock)
+{
+    ASSERT(static_cast<unsigned>(codeBlock->numCalleeLocals()) == WTF::roundUpToMultipleOf(stackAlignmentRegisters(), static_cast<unsigned>(codeBlock->numCalleeLocals())));
+
+    return roundLocalRegisterCountForFramePointerOffset(codeBlock->numCalleeLocals() + maxFrameExtentForSlowPathCallInRegisters);
+}
+
 unsigned JIT::frameRegisterCountFor(CodeBlock* codeBlock)
 {
-    ASSERT(static_cast<unsigned>(codeBlock->numCalleeLocals()) == WTF::roundUpToMultipleOf(stackAlignmentRegisters(), static_cast<unsigned>(codeBlock->numCalleeLocals())));
-
-    return roundLocalRegisterCountForFramePointerOffset(codeBlock->numCalleeLocals() + maxFrameExtentForSlowPathCallInRegisters);
+    return frameRegisterCountFor(codeBlock->unlinkedCodeBlock());
+}
+
+int JIT::stackPointerOffsetFor(UnlinkedCodeBlock* codeBlock)
+{
+    return virtualRegisterForLocal(frameRegisterCountFor(codeBlock) - 1).offset();
 }

 int JIT::stackPointerOffsetFor(CodeBlock* codeBlock)
 {
-    return virtualRegisterForLocal(frameRegisterCountFor(codeBlock) - 1).offset();
+    return stackPointerOffsetFor(codeBlock->unlinkedCodeBlock());
 }

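The core idea in the hunks above — compile machine code once against the shared UnlinkedCodeBlock, then have each CodeBlock instantiation supply only its own "linked constant pool" (JSGlobalObject*, StructureStubInfo*, jump tables) — can be sketched in miniature. This is a hypothetical illustration; `UnlinkedBlock`, `LinkedBlock`, and `LinkedConstantPool` are invented names for this sketch, not the actual JavaScriptCore API:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Per-instance state, analogous to the patch's linked constant pool
// (JSGlobalObject*, StructureStubInfo*, linked jump tables, ...).
struct LinkedConstantPool {
    void* globalObject;          // stands in for JSGlobalObject*
    std::vector<int> stubInfos;  // stands in for per-instance IC data
};

// Shared state, analogous to UnlinkedCodeBlock: the expensive codegen
// result is cached here and shared by every instantiation.
struct UnlinkedBlock {
    std::shared_ptr<std::string> sharedJITCode;
    int timesCompiled = 0;

    std::shared_ptr<std::string> ensureCompiled()
    {
        if (!sharedJITCode) {
            ++timesCompiled; // the expensive "baseline JIT compile" happens once
            sharedJITCode = std::make_shared<std::string>("machine code");
        }
        return sharedJITCode;
    }
};

// Analogous to a CodeBlock that adopts the unlinked code: it shares the
// generated code but owns a fresh constant pool.
struct LinkedBlock {
    std::shared_ptr<std::string> jitCode; // shared with sibling instantiations
    LinkedConstantPool pool;              // unique per instantiation

    static LinkedBlock instantiate(UnlinkedBlock& unlinked, void* globalObject)
    {
        return { unlinked.ensureCompiled(), { globalObject, {} } };
    }
};
```

A second `LinkedBlock::instantiate` call against the same `UnlinkedBlock` reuses the cached code and only rebuilds the small per-instance pool, which is the "starts off executing in the baseline JIT for free" behavior the ChangeLog describes.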