Changeset 179357 in webkit
- Timestamp: Jan 29, 2015, 12:33:45 PM
- Location: trunk/Source/JavaScriptCore
- Files: 2 added, 5 deleted, 45 edited
trunk/Source/JavaScriptCore/CMakeLists.txt
r179223 → r179357

      bytecode/BytecodeLivenessAnalysis.cpp
      bytecode/CallEdge.cpp
    - bytecode/CallEdgeProfile.cpp
      bytecode/CallLinkInfo.cpp
      bytecode/CallLinkStatus.cpp
      …
      jit/ArityCheckFailReturnThunks.cpp
      jit/BinarySwitch.cpp
    - jit/ClosureCallStubRoutine.cpp
      jit/ExecutableAllocator.cpp
      jit/ExecutableAllocatorFixedVMPool.cpp
      …
      jit/JITThunks.cpp
      jit/JITToDFGDeferredCompilationCallback.cpp
    + jit/PolymorphicCallStubRoutine.cpp
      jit/Reg.cpp
      jit/RegisterPreservationWrapperGenerator.cpp
trunk/Source/JavaScriptCore/ChangeLog
r179349 → r179357 (new entry added above the existing "2015-01-29  Joseph Pecoraro" entry)

2015-01-28  Filip Pizlo  <[email protected]>

        Polymorphic call inlining should be based on polymorphic call inline caching rather than logging
        https://bugs.webkit.org/show_bug.cgi?id=140660

        Reviewed by Geoffrey Garen.

        When we first implemented polymorphic call inlining, we did the profiling based on a call
        edge log. The idea was to store each call edge (a tuple of call site and callee) into a
        global log that was processed lazily. Processing the log would give precise counts of call
        edges, and could be used to drive well-informed inlining decisions - polymorphic or not.
        This was a speed-up on throughput tests but a slow-down for latency tests. It was a net win
        nonetheless.

        Experience with this code shows three things. First, the call edge profiler is buggy and
        complex. It would take work to fix the bugs. Second, the call edge profiler incurs lots of
        overhead for latency code that we care deeply about. Third, it's not at all clear that
        having call edge counts for every possible callee is any better than just having call edge
        counts for the limited number of callees that an inline cache would catch.

        So, this patch removes the call edge profiler and replaces it with a polymorphic call inline
        cache. If we miss the basic call inline cache, we inflate the cache to be a jump to an
        out-of-line stub that cases on the previously known callees. If that misses again, then we
        rewrite that stub to include the new callee. We do this up to some number of callees. If we
        hit the limit then we switch to using a plain virtual call.

        Substantial speed-up on V8Spider; undoes the slow-down that the original call edge profiler
        caused. Might be a SunSpider speed-up (below 1%), depending on hardware.

        * CMakeLists.txt:
        * JavaScriptCore.vcxproj/JavaScriptCore.vcxproj:
        * JavaScriptCore.xcodeproj/project.pbxproj:
        * bytecode/CallEdge.h:
        (JSC::CallEdge::count):
        (JSC::CallEdge::CallEdge):
        * bytecode/CallEdgeProfile.cpp: Removed.
        * bytecode/CallEdgeProfile.h: Removed.
        * bytecode/CallEdgeProfileInlines.h: Removed.
        * bytecode/CallLinkInfo.cpp:
        (JSC::CallLinkInfo::unlink):
        (JSC::CallLinkInfo::visitWeak):
        * bytecode/CallLinkInfo.h:
        * bytecode/CallLinkStatus.cpp:
        (JSC::CallLinkStatus::CallLinkStatus):
        (JSC::CallLinkStatus::computeFor):
        (JSC::CallLinkStatus::computeFromCallLinkInfo):
        (JSC::CallLinkStatus::isClosureCall):
        (JSC::CallLinkStatus::makeClosureCall):
        (JSC::CallLinkStatus::dump):
        (JSC::CallLinkStatus::computeFromCallEdgeProfile): Deleted.
        * bytecode/CallLinkStatus.h:
        (JSC::CallLinkStatus::CallLinkStatus):
        (JSC::CallLinkStatus::isSet):
        (JSC::CallLinkStatus::variants):
        (JSC::CallLinkStatus::size):
        (JSC::CallLinkStatus::at):
        (JSC::CallLinkStatus::operator[]):
        (JSC::CallLinkStatus::canOptimize):
        (JSC::CallLinkStatus::edges): Deleted.
        (JSC::CallLinkStatus::canTrustCounts): Deleted.
        * bytecode/CallVariant.cpp:
        (JSC::variantListWithVariant):
        (JSC::despecifiedVariantList):
        * bytecode/CallVariant.h:
        * bytecode/CodeBlock.cpp:
        (JSC::CodeBlock::~CodeBlock):
        (JSC::CodeBlock::linkIncomingPolymorphicCall):
        (JSC::CodeBlock::unlinkIncomingCalls):
        (JSC::CodeBlock::noticeIncomingCall):
        * bytecode/CodeBlock.h:
        (JSC::CodeBlock::isIncomingCallAlreadyLinked): Deleted.
        * dfg/DFGAbstractInterpreterInlines.h:
        (JSC::DFG::AbstractInterpreter<AbstractStateType>::executeEffects):
        * dfg/DFGByteCodeParser.cpp:
        (JSC::DFG::ByteCodeParser::addCallWithoutSettingResult):
        (JSC::DFG::ByteCodeParser::handleCall):
        (JSC::DFG::ByteCodeParser::handleInlining):
        * dfg/DFGClobberize.h:
        (JSC::DFG::clobberize):
        * dfg/DFGConstantFoldingPhase.cpp:
        (JSC::DFG::ConstantFoldingPhase::foldConstants):
        * dfg/DFGDoesGC.cpp:
        (JSC::DFG::doesGC):
        * dfg/DFGDriver.cpp:
        (JSC::DFG::compileImpl):
        * dfg/DFGFixupPhase.cpp:
        (JSC::DFG::FixupPhase::fixupNode):
        * dfg/DFGNode.h:
        (JSC::DFG::Node::hasHeapPrediction):
        * dfg/DFGNodeType.h:
        * dfg/DFGOperations.cpp:
        * dfg/DFGPredictionPropagationPhase.cpp:
        (JSC::DFG::PredictionPropagationPhase::propagate):
        * dfg/DFGSafeToExecute.h:
        (JSC::DFG::safeToExecute):
        * dfg/DFGSpeculativeJIT32_64.cpp:
        (JSC::DFG::SpeculativeJIT::emitCall):
        (JSC::DFG::SpeculativeJIT::compile):
        * dfg/DFGSpeculativeJIT64.cpp:
        (JSC::DFG::SpeculativeJIT::emitCall):
        (JSC::DFG::SpeculativeJIT::compile):
        * dfg/DFGTierUpCheckInjectionPhase.cpp:
        (JSC::DFG::TierUpCheckInjectionPhase::run):
        (JSC::DFG::TierUpCheckInjectionPhase::removeFTLProfiling): Deleted.
        * ftl/FTLCapabilities.cpp:
        (JSC::FTL::canCompile):
        * heap/Heap.cpp:
        (JSC::Heap::collect):
        * jit/BinarySwitch.h:
        * jit/ClosureCallStubRoutine.cpp: Removed.
        * jit/ClosureCallStubRoutine.h: Removed.
        * jit/JITCall.cpp:
        (JSC::JIT::compileOpCall):
        * jit/JITCall32_64.cpp:
        (JSC::JIT::compileOpCall):
        * jit/JITOperations.cpp:
        * jit/JITOperations.h:
        (JSC::operationLinkPolymorphicCallFor):
        (JSC::operationLinkClosureCallFor): Deleted.
        * jit/JITStubRoutine.h:
        * jit/JITWriteBarrier.h:
        * jit/PolymorphicCallStubRoutine.cpp: Added.
        (JSC::PolymorphicCallNode::~PolymorphicCallNode):
        (JSC::PolymorphicCallNode::unlink):
        (JSC::PolymorphicCallCase::dump):
        (JSC::PolymorphicCallStubRoutine::PolymorphicCallStubRoutine):
        (JSC::PolymorphicCallStubRoutine::~PolymorphicCallStubRoutine):
        (JSC::PolymorphicCallStubRoutine::variants):
        (JSC::PolymorphicCallStubRoutine::edges):
        (JSC::PolymorphicCallStubRoutine::visitWeak):
        (JSC::PolymorphicCallStubRoutine::markRequiredObjectsInternal):
        * jit/PolymorphicCallStubRoutine.h: Added.
        (JSC::PolymorphicCallNode::PolymorphicCallNode):
        (JSC::PolymorphicCallCase::PolymorphicCallCase):
        (JSC::PolymorphicCallCase::variant):
        (JSC::PolymorphicCallCase::codeBlock):
        * jit/Repatch.cpp:
        (JSC::linkSlowFor):
        (JSC::linkFor):
        (JSC::revertCall):
        (JSC::unlinkFor):
        (JSC::linkVirtualFor):
        (JSC::linkPolymorphicCall):
        (JSC::linkClosureCall): Deleted.
        * jit/Repatch.h:
        * jit/ThunkGenerators.cpp:
        (JSC::linkPolymorphicCallForThunkGenerator):
        (JSC::linkPolymorphicCallThunkGenerator):
        (JSC::linkPolymorphicCallThatPreservesRegsThunkGenerator):
        (JSC::linkClosureCallForThunkGenerator): Deleted.
        (JSC::linkClosureCallThunkGenerator): Deleted.
        (JSC::linkClosureCallThatPreservesRegsThunkGenerator): Deleted.
        * jit/ThunkGenerators.h:
        (JSC::linkPolymorphicCallThunkGeneratorFor):
        (JSC::linkClosureCallThunkGeneratorFor): Deleted.
        * llint/LLIntSlowPaths.cpp:
        (JSC::LLInt::jitCompileAndSetHeuristics):
        * runtime/Options.h:
        * runtime/VM.cpp:
        (JSC::VM::prepareToDiscardCode):
        (JSC::VM::ensureCallEdgeLog): Deleted.
        * runtime/VM.h:
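The escalation the ChangeLog describes can be pictured with a small standalone C++ sketch. This is not the actual JIT code (the real stubs are generated by linkPolymorphicCall() in jit/Repatch.cpp with the MacroAssembler); CallSite, Callee, kMaxCallVariants and the dispatch helpers below are hypothetical stand-ins used only to illustrate the state machine: a monomorphic cache, then a bounded out-of-line stub that cases on known callees, then a plain virtual call once the limit is hit.

    // Illustrative sketch only, not JSC code.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    using Callee = int;                        // stand-in identifier for a JSFunction/executable
    static const size_t kMaxCallVariants = 15; // stand-in for the real variant limit option

    struct CallSite {
        std::vector<Callee> knownCallees; // cases baked into the out-of-line stub
        bool isVirtual = false;           // true once we give up and use a plain virtual call

        void call(Callee callee)
        {
            if (isVirtual) {
                dispatchVirtually(callee);
                return;
            }
            // Fast path: the stub already cases on this callee.
            if (std::find(knownCallees.begin(), knownCallees.end(), callee) != knownCallees.end()) {
                dispatchDirectly(callee);
                return;
            }
            // Miss: rewrite the stub to include the new callee, up to the limit.
            if (knownCallees.size() < kMaxCallVariants) {
                knownCallees.push_back(callee); // in JSC this regenerates the stub routine
                dispatchDirectly(callee);
                return;
            }
            // Too many callees seen; fall back to a virtual call from now on.
            isVirtual = true;
            dispatchVirtually(callee);
        }

        void dispatchDirectly(Callee c) { std::printf("direct call to callee %d\n", c); }
        void dispatchVirtually(Callee c) { std::printf("virtual call to callee %d\n", c); }
    };

    int main()
    {
        CallSite site;
        for (Callee callee = 0; callee < 20; ++callee)
            site.call(callee); // escalates: direct cases first, then virtual
        site.call(3);          // stays virtual once the site has given up
    }

In the real patch the "rewrite the stub" step builds a new PolymorphicCallStubRoutine, and the per-callee dispatch and fallback are emitted as machine code rather than data-driven lookups.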
trunk/Source/JavaScriptCore/JavaScriptCore.vcxproj/JavaScriptCore.vcxproj
r179223 → r179357

      <ClCompile Include="..\bytecode\BytecodeLivenessAnalysis.cpp" />
      <ClCompile Include="..\bytecode\CallEdge.cpp" />
    - <ClCompile Include="..\bytecode\CallEdgeProfile.cpp" />
      <ClCompile Include="..\bytecode\CallLinkInfo.cpp" />
      <ClCompile Include="..\bytecode\CallLinkStatus.cpp" />
      …
      <ClCompile Include="..\jit\AssemblyHelpers.cpp" />
      <ClCompile Include="..\jit\BinarySwitch.cpp" />
    - <ClCompile Include="..\jit\ClosureCallStubRoutine.cpp" />
      <ClCompile Include="..\jit\ExecutableAllocator.cpp" />
      <ClCompile Include="..\jit\GCAwareJITStubRoutine.cpp" />
      …
      <ClCompile Include="..\jit\JITThunks.cpp" />
      <ClCompile Include="..\jit\JITToDFGDeferredCompilationCallback.cpp" />
    + <ClCompile Include="..\jit\PolymorphicCallStubRoutine.cpp" />
      <ClCompile Include="..\jit\Reg.cpp" />
      <ClCompile Include="..\jit\RegisterPreservationWrapperGenerator.cpp" />
      …
      <ClInclude Include="..\bytecode\BytecodeUseDef.h" />
      <ClInclude Include="..\bytecode\CallEdge.h" />
    - <ClInclude Include="..\bytecode\CallEdgeProfile.h" />
    - <ClInclude Include="..\bytecode\CallEdgeProfileInlines.h" />
      <ClInclude Include="..\bytecode\CallLinkInfo.h" />
      <ClInclude Include="..\bytecode\CallLinkStatus.h" />
      …
      <ClInclude Include="..\jit\BinarySwitch.h" />
      <ClInclude Include="..\jit\CCallHelpers.h" />
    - <ClInclude Include="..\jit\ClosureCallStubRoutine.h" />
      <ClInclude Include="..\jit\CompactJITCodeMap.h" />
      <ClInclude Include="..\jit\ExecutableAllocator.h" />
      …
      <ClInclude Include="..\jit\JITWriteBarrier.h" />
      <ClInclude Include="..\jit\JSInterfaceJIT.h" />
    + <ClInclude Include="..\jit\PolymorphicCallStubRoutine.h" />
      <ClInclude Include="..\jit\Reg.h" />
      <ClInclude Include="..\jit\RegisterPreservationWrapperGenerator.h" />
trunk/Source/JavaScriptCore/JavaScriptCore.xcodeproj/project.pbxproj
r179345 → r179357

    Xcode project bookkeeping only. Removes the PBXBuildFile and PBXFileReference entries
    for CallEdgeProfile.cpp, CallEdgeProfile.h, CallEdgeProfileInlines.h,
    ClosureCallStubRoutine.cpp and ClosureCallStubRoutine.h, and re-registers CallEdge.cpp
    and CallEdge.h under new file references. Adds PBXBuildFile and PBXFileReference entries
    for PolymorphicCallStubRoutine.cpp and PolymorphicCallStubRoutine.h, and updates the
    bytecode and jit groups and the Headers/Sources build phases to match.
trunk/Source/JavaScriptCore/bytecode/CallEdge.h
r173069 → r179357

      /*
    -  * Copyright (C) 2014 Apple Inc. All rights reserved.
    +  * Copyright (C) 2014, 2015 Apple Inc. All rights reserved.
       *
       * Redistribution and use in source and binary forms, with or without
      …
      namespace JSC {

    - typedef uint16_t CallEdgeCountType;
    -
      class CallEdge {
      public:
          CallEdge();
    -     CallEdge(CallVariant, CallEdgeCountType);
    +     CallEdge(CallVariant, uint32_t);

          bool operator!() const { return !m_callee; }

          CallVariant callee() const { return m_callee; }
    -     CallEdgeCountType count() const { return m_count; }
    +     uint32_t count() const { return m_count; }

          CallEdge despecifiedClosure() const
      …
          void dump(PrintStream&) const;

    - public:
    + private:
          CallVariant m_callee;
    -     CallEdgeCountType m_count;
    +     uint32_t m_count;
      };

    - inline CallEdge::CallEdge(CallVariant callee, CallEdgeCountType count)
    + inline CallEdge::CallEdge(CallVariant callee, uint32_t count)
          : m_callee(callee)
          , m_count(count)
trunk/Source/JavaScriptCore/bytecode/CallLinkInfo.cpp
r173069 → r179357

      #include "DFGThunks.h"
      #include "JSCInlines.h"
    + #include "Repatch.h"
      #include "RepatchBuffer.h"
    + #include <wtf/ListDump.h>
      #include <wtf/NeverDestroyed.h>
      …
      void CallLinkInfo::unlink(RepatchBuffer& repatchBuffer)
      {
    -     ASSERT(isLinked());
    +     if (!isLinked()) {
    +         // We could be called even if we're not linked anymore because of how polymorphic calls
    +         // work. Each callsite within the polymorphic call stub may separately ask us to unlink().
    +         RELEASE_ASSERT(!isOnList());
    +         return;
    +     }

    -     if (Options::showDisassembly())
    -         dataLog("Unlinking call from ", callReturnLocation, " to ", pointerDump(repatchBuffer.codeBlock()), "\n");
    -
    -     repatchBuffer.revertJumpReplacementToBranchPtrWithPatch(RepatchBuffer::startOfBranchPtrWithPatchOnRegister(hotPathBegin), static_cast<MacroAssembler::RegisterID>(calleeGPR), 0);
    -     repatchBuffer.relink(
    -         callReturnLocation,
    -         repatchBuffer.codeBlock()->vm()->getCTIStub(linkThunkGeneratorFor(
    -             (callType == Construct || callType == ConstructVarargs) ? CodeForConstruct : CodeForCall,
    -             isFTL ? MustPreserveRegisters : RegisterPreservationNotRequired)).code());
    -     hasSeenShouldRepatch = false;
    -     callee.clear();
    -     stub.clear();
    +     unlinkFor(
    +         repatchBuffer, *this,
    +         (callType == Construct || callType == ConstructVarargs) ? CodeForConstruct : CodeForCall,
    +         isFTL ? MustPreserveRegisters : RegisterPreservationNotRequired);

          // It will be on a list if the callee has a code block.
      …
      if (isLinked()) {
          if (stub) {
    -         if (!Heap::isMarked(stub->executable())) {
    +         if (!stub->visitWeak(repatchBuffer)) {
                  if (Options::verboseOSR()) {
                      dataLog(
                          "Clearing closure call from ", *repatchBuffer.codeBlock(), " to ",
    -                     stub->executable()->hashFor(specializationKind()),
    -                     ", stub routine ", RawPointer(stub.get()), ".\n");
    +                     listDump(stub->variants()), ", stub routine ", RawPointer(stub.get()),
    +                     ".\n");
                  }
                  unlink(repatchBuffer);
      …
      if (!!lastSeenCallee && !Heap::isMarked(lastSeenCallee.get()))
          lastSeenCallee.clear();
    -
    -     if (callEdgeProfile) {
    -         WTF::loadLoadFence();
    -         callEdgeProfile->visitWeak();
    -     }
trunk/Source/JavaScriptCore/bytecode/CallLinkInfo.h
r173069 → r179357

      /*
    -  * Copyright (C) 2012, 2014 Apple Inc. All rights reserved.
    +  * Copyright (C) 2012, 2014, 2015 Apple Inc. All rights reserved.
       *
       * Redistribution and use in source and binary forms, with or without
      …
      #define CallLinkInfo_h

    - #include "CallEdgeProfile.h"
    - #include "ClosureCallStubRoutine.h"
      #include "CodeLocation.h"
      #include "CodeSpecializationKind.h"
      …
      #include "JSFunction.h"
      #include "Opcode.h"
    + #include "PolymorphicCallStubRoutine.h"
      #include "WriteBarrier.h"
      #include <wtf/OwnPtr.h>
      …
      JITWriteBarrier<JSFunction> callee;
      WriteBarrier<JSFunction> lastSeenCallee;
    - RefPtr<ClosureCallStubRoutine> stub;
    + RefPtr<PolymorphicCallStubRoutine> stub;
      bool isFTL : 1;
      bool hasSeenShouldRepatch : 1;
      …
      unsigned callType : 5; // CallType
      unsigned calleeGPR : 8;
    - unsigned slowPathCount;
    + uint32_t slowPathCount;
      CodeOrigin codeOrigin;
    - OwnPtr<CallEdgeProfile> callEdgeProfile;

      bool isLinked() { return stub || callee; }
trunk/Source/JavaScriptCore/bytecode/CallLinkStatus.cpp
r179241 → r179357

      /*
    -  * Copyright (C) 2012, 2013, 2014 Apple Inc. All rights reserved.
    +  * Copyright (C) 2012-2015 Apple Inc. All rights reserved.
       *
       * Redistribution and use in source and binary forms, with or without
      …
      }

    -     m_edges.append(CallEdge(CallVariant(value.asCell()), 1));
    +     m_variants.append(CallVariant(value.asCell()));
      }
      …
      UNUSED_PARAM(profiledBlock);

    -     if (Options::callStatusShouldUseCallEdgeProfile()) {
    -         // Always trust the call edge profile over anything else since this has precise counts.
    -         // It can make the best possible decision because it never "forgets" what happened for any
    -         // call, with the exception of fading out the counts of old calls (for example if the
    -         // counter type is 16-bit then calls that happened more than 2^16 calls ago are given half
    -         // weight, and this compounds for every 2^15 [sic] calls after that). The combination of
    -         // high fidelity for recent calls and fading for older calls makes this the most useful
    -         // mechamism of choosing how to optimize future calls.
    -         CallEdgeProfile* edgeProfile = callLinkInfo.callEdgeProfile.get();
    -         WTF::loadLoadFence();
    -         if (edgeProfile) {
    -             CallLinkStatus result = computeFromCallEdgeProfile(edgeProfile);
    -             if (!!result)
    -                 return result;
    -         }
    -     }
    -
          return computeFromCallLinkInfo(locker, callLinkInfo);
      }
      …
      // is probably OK for now.

    +     // PolymorphicCallStubRoutine is a GCAwareJITStubRoutine, so if non-null, it will stay alive
    +     // until next GC even if the CallLinkInfo is concurrently cleared. Also, the variants list is
    +     // never mutated after the PolymorphicCallStubRoutine is instantiated. We have some conservative
    +     // fencing in place to make sure that we see the variants list after construction.
    +     if (PolymorphicCallStubRoutine* stub = callLinkInfo.stub.get()) {
    +         WTF::loadLoadFence();
    +
    +         CallEdgeList edges = stub->edges();
    +
    +         // Now that we've loaded the edges list, there are no further concurrency concerns. We will
    +         // just manipulate and prune this list to our liking - mostly removing entries that are too
    +         // infrequent and ensuring that it's sorted in descending order of frequency.
    +
    +         RELEASE_ASSERT(edges.size());
    +
    +         std::sort(
    +             edges.begin(), edges.end(),
    +             [] (CallEdge a, CallEdge b) {
    +                 return a.count() > b.count();
    +             });
    +         RELEASE_ASSERT(edges.first().count() >= edges.last().count());
    +
    +         double totalCallsToKnown = 0;
    +         double totalCallsToUnknown = callLinkInfo.slowPathCount;
    +         CallVariantList variants;
    +         for (size_t i = 0; i < edges.size(); ++i) {
    +             CallEdge edge = edges[i];
    +             // If the call is at the tail of the distribution, then we don't optimize it and we
    +             // treat it as if it was a call to something unknown. We define the tail as being either
    +             // a call that doesn't belong to the N most frequent callees (N =
    +             // maxPolymorphicCallVariantsForInlining) or that has a total call count that is too
    +             // small.
    +             if (i >= Options::maxPolymorphicCallVariantsForInlining()
    +                 || edge.count() < Options::frequentCallThreshold())
    +                 totalCallsToUnknown += edge.count();
    +             else {
    +                 totalCallsToKnown += edge.count();
    +                 variants.append(edge.callee());
    +             }
    +         }
    +
    +         // Bail if we didn't find any calls that qualified.
    +         RELEASE_ASSERT(!!totalCallsToKnown == !!variants.size());
    +         if (variants.isEmpty())
    +             return takesSlowPath();
    +
    +         // We require that the distribution of callees is skewed towards a handful of common ones.
    +         if (totalCallsToKnown / totalCallsToUnknown < Options::minimumCallToKnownRate())
    +             return takesSlowPath();
    +
    +         RELEASE_ASSERT(totalCallsToKnown);
    +         RELEASE_ASSERT(variants.size());
    +
    +         CallLinkStatus result;
    +         result.m_variants = variants;
    +         result.m_couldTakeSlowPath = !!totalCallsToUnknown;
    +         return result;
    +     }
    +
          if (callLinkInfo.slowPathCount >= Options::couldTakeSlowCaseMinimumCount())
              return takesSlowPath();
    -
    -     if (ClosureCallStubRoutine* stub = callLinkInfo.stub.get())
    -         return CallLinkStatus(stub->executable());

          JSFunction* target = callLinkInfo.lastSeenCallee.get();
      …
          return CallLinkStatus(target);
      }
    -
    - CallLinkStatus CallLinkStatus::computeFromCallEdgeProfile(CallEdgeProfile* edgeProfile)
    - {
    -     // In cases where the call edge profile saw nothing, use the CallLinkInfo instead.
    -     if (!edgeProfile->totalCalls())
    -         return CallLinkStatus();
    -
    -     // To do anything meaningful, we require that the majority of calls are to something we
    -     // know how to handle.
    -     unsigned numCallsToKnown = edgeProfile->numCallsToKnownCells();
    -     unsigned numCallsToUnknown = edgeProfile->numCallsToNotCell() + edgeProfile->numCallsToUnknownCell();
    -
    -     // We require that the majority of calls were to something that we could possibly inline.
    -     if (numCallsToKnown <= numCallsToUnknown)
    -         return takesSlowPath();
    -
    -     // We require that the number of such calls is greater than some minimal threshold, so that we
    -     // avoid inlining completely cold calls.
    -     if (numCallsToKnown < Options::frequentCallThreshold())
    -         return takesSlowPath();
    -
    -     CallLinkStatus result;
    -     result.m_edges = edgeProfile->callEdges();
    -     result.m_couldTakeSlowPath = !!numCallsToUnknown;
    -     result.m_canTrustCounts = true;
    -
    -     return result;
    - }
      …
      bool CallLinkStatus::isClosureCall() const
      {
    -     for (unsigned i = m_edges.size(); i--;) {
    -         if (m_edges[i].callee().isClosureCall())
    +     for (unsigned i = m_variants.size(); i--;) {
    +         if (m_variants[i].isClosureCall())
                  return true;
          }
      …
      void CallLinkStatus::makeClosureCall()
      {
    -     ASSERT(!m_isProved);
    -     for (unsigned i = m_edges.size(); i--;)
    -         m_edges[i] = m_edges[i].despecifiedClosure();
    -
    -     if (!ASSERT_DISABLED) {
    -         // Doing this should not have created duplicates, because the CallEdgeProfile
    -         // should despecify closures if doing so would reduce the number of known callees.
    -         for (unsigned i = 0; i < m_edges.size(); ++i) {
    -             for (unsigned j = i + 1; j < m_edges.size(); ++j)
    -                 ASSERT(m_edges[i].callee() != m_edges[j].callee());
    -         }
    -     }
    +     m_variants = despecifiedVariantList(m_variants);
      }
      …
          out.print(comma, "Could Take Slow Path");

    -     out.print(listDump(m_edges));
    +     if (!m_variants.isEmpty())
    +         out.print(comma, listDump(m_variants));
      }
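For readers skimming the diff, the pruning heuristic that the new computeFromCallLinkInfo() applies to the stub's call edges can be summarized by the following standalone sketch. It mirrors the logic above but is not JSC code; the threshold constants are hypothetical placeholders for Options::maxPolymorphicCallVariantsForInlining(), Options::frequentCallThreshold() and Options::minimumCallToKnownRate().

    // Standalone sketch of the variant-selection heuristic; thresholds are made up.
    #include <algorithm>
    #include <cstdint>
    #include <string>
    #include <vector>

    struct Edge {
        std::string callee;
        uint32_t count;
    };

    // Returns the callees worth specializing on, or an empty list if the call site should
    // just take the slow (virtual) path.
    std::vector<std::string> selectVariants(std::vector<Edge> edges, uint32_t slowPathCount)
    {
        const size_t maxVariants = 5;         // stand-in for maxPolymorphicCallVariantsForInlining()
        const uint32_t frequentThreshold = 2; // stand-in for frequentCallThreshold()
        const double minimumKnownRate = 5;    // stand-in for minimumCallToKnownRate()

        // Sort edges by observed call count, most frequent first.
        std::sort(edges.begin(), edges.end(),
            [](const Edge& a, const Edge& b) { return a.count > b.count; });

        double callsToKnown = 0;
        double callsToUnknown = slowPathCount;
        std::vector<std::string> variants;
        for (size_t i = 0; i < edges.size(); ++i) {
            // Edges in the tail of the distribution count as calls to something unknown.
            if (i >= maxVariants || edges[i].count < frequentThreshold)
                callsToUnknown += edges[i].count;
            else {
                callsToKnown += edges[i].count;
                variants.push_back(edges[i].callee);
            }
        }

        // Require some qualifying callees and a distribution skewed towards them.
        if (variants.empty() || callsToKnown / callsToUnknown < minimumKnownRate)
            return {};
        return variants;
    }

The design point is that, unlike the old call edge profiler, the counts here come only from the callees the inline cache happened to catch, which the ChangeLog argues is good enough for inlining decisions while avoiding the logging overhead.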
trunk/Source/JavaScriptCore/bytecode/CallLinkStatus.h
r173069 → r179357

      /*
    -  * Copyright (C) 2012, 2013, 2014 Apple Inc. All rights reserved.
    +  * Copyright (C) 2012-2015 Apple Inc. All rights reserved.
       *
       * Redistribution and use in source and binary forms, with or without
      …
      #include "CallLinkInfo.h"
    + #include "CallVariant.h"
      #include "CodeOrigin.h"
      #include "CodeSpecializationKind.h"
      …
          : m_couldTakeSlowPath(false)
          , m_isProved(false)
    -     , m_canTrustCounts(false)
      {
      }
      …
      CallLinkStatus(CallVariant variant)
    -     : m_edges(1, CallEdge(variant, 1))
    +     : m_variants(1, variant)
          , m_couldTakeSlowPath(false)
          , m_isProved(false)
    -     , m_canTrustCounts(false)
      {
      }
      …
          CodeBlock*, CodeOrigin, const CallLinkInfoMap&, const ContextMap&);

    - bool isSet() const { return !m_edges.isEmpty() || m_couldTakeSlowPath; }
    + bool isSet() const { return !m_variants.isEmpty() || m_couldTakeSlowPath; }

      bool operator!() const { return !isSet(); }
      …
      bool couldTakeSlowPath() const { return m_couldTakeSlowPath; }

    - CallEdgeList edges() const { return m_edges; }
    - unsigned size() const { return m_edges.size(); }
    - CallEdge at(unsigned i) const { return m_edges[i]; }
    - CallEdge operator[](unsigned i) const { return at(i); }
    + CallVariantList variants() const { return m_variants; }
    + unsigned size() const { return m_variants.size(); }
    + CallVariant at(unsigned i) const { return m_variants[i]; }
    + CallVariant operator[](unsigned i) const { return at(i); }
      bool isProved() const { return m_isProved; }
    - bool canOptimize() const { return !m_edges.isEmpty(); }
    - bool canTrustCounts() const { return m_canTrustCounts; }
    + bool canOptimize() const { return !m_variants.isEmpty(); }

      bool isClosureCall() const; // Returns true if any callee is a closure call.
      …
      static CallLinkStatus computeFromLLInt(const ConcurrentJITLocker&, CodeBlock*, unsigned bytecodeIndex);
      #if ENABLE(JIT)
    - static CallLinkStatus computeFromCallEdgeProfile(CallEdgeProfile*);
      static CallLinkStatus computeFromCallLinkInfo(
          const ConcurrentJITLocker&, CallLinkInfo&);
      #endif

    - CallEdgeList m_edges;
    + CallVariantList m_variants;
      bool m_couldTakeSlowPath;
      bool m_isProved;
    - bool m_canTrustCounts;
      };
trunk/Source/JavaScriptCore/bytecode/CallVariant.cpp
r173069 → r179357

      /*
    -  * Copyright (C) 2014 Apple Inc. All rights reserved.
    +  * Copyright (C) 2014, 2015 Apple Inc. All rights reserved.
       *
       * Redistribution and use in source and binary forms, with or without
      …
      }

    + CallVariantList variantListWithVariant(const CallVariantList& list, CallVariant variantToAdd)
    + {
    +     ASSERT(variantToAdd);
    +     CallVariantList result;
    +     for (CallVariant variant : list) {
    +         ASSERT(variant);
    +         if (!!variantToAdd) {
    +             if (variant == variantToAdd)
    +                 variantToAdd = CallVariant();
    +             else if (variant.despecifiedClosure() == variantToAdd.despecifiedClosure()) {
    +                 variant = variant.despecifiedClosure();
    +                 variantToAdd = CallVariant();
    +             }
    +         }
    +         result.append(variant);
    +     }
    +     if (!!variantToAdd)
    +         result.append(variantToAdd);
    +     return result;
    + }
    +
    + CallVariantList despecifiedVariantList(const CallVariantList& list)
    + {
    +     CallVariantList result;
    +     for (CallVariant variant : list)
    +         result = variantListWithVariant(result, variant.despecifiedClosure());
    +     return result;
    + }
    +
      } // namespace JSC
trunk/Source/JavaScriptCore/bytecode/CallVariant.h
r173517 → r179357

      /*
    -  * Copyright (C) 2014 Apple Inc. All rights reserved.
    +  * Copyright (C) 2014, 2015 Apple Inc. All rights reserved.
       *
       * Redistribution and use in source and binary forms, with or without
      …
      //
      // This class serves as a kind of union over these four things. It does so by just holding a
    - // JSCell*. We determine which of the modes its in by doing type checks on the cell. Note that there
    - // is no lifecycle management for the cell because this class is always used in contexts where we
    - // either do custom weak reference logic over instances of this class (see CallEdgeProfile), or we
    - // are inside the compiler and we assume that the compiler runs in between collections and so can
    - // touch the heap without notifying anyone.
    + // JSCell*. We determine which of the modes its in by doing type checks on the cell. Note that we
    + // cannot use WriteBarrier<> here because this gets used inside the compiler.

      class CallVariant {
      …
      typedef Vector<CallVariant, 1> CallVariantList;

    + // Returns a new variant list by attempting to either append the given variant or merge it with one
    + // of the variants we already have by despecifying closures.
    + CallVariantList variantListWithVariant(const CallVariantList&, CallVariant);
    +
    + // Returns a new list where every element is despecified, and the list is deduplicated.
    + CallVariantList despecifiedVariantList(const CallVariantList&);
    +
      } // namespace JSC
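The despecification merge that variantListWithVariant() performs is easiest to see with a toy model. The sketch below is illustrative only; Variant is a hypothetical stand-in for JSC::CallVariant, reduced to an executable id plus an optional closure id, and withVariant() mirrors the merge rule: an exact duplicate is dropped, and two closures of the same executable collapse into one despecified entry.

    // Toy model of the merge rule; not JSC code.
    #include <vector>

    struct Variant {
        int executable; // which function body
        int closure;    // which closure instance; 0 means "despecified" (any closure)

        Variant despecified() const { return { executable, 0 }; }
        bool operator==(const Variant& other) const
        {
            return executable == other.executable && closure == other.closure;
        }
    };

    // Append the variant, or merge it with an existing variant for the same executable by
    // despecifying both.
    std::vector<Variant> withVariant(std::vector<Variant> list, Variant toAdd)
    {
        bool consumed = false;
        for (Variant& v : list) {
            if (consumed)
                continue;
            if (v == toAdd)
                consumed = true; // exact duplicate: nothing to add
            else if (v.despecified() == toAdd.despecified()) {
                v = v.despecified(); // same executable, different closure: despecify and merge
                consumed = true;
            }
        }
        if (!consumed)
            list.push_back(toAdd);
        return list;
    }

This is why makeClosureCall() in CallLinkStatus.cpp can now simply call despecifiedVariantList(): the merge guarantees the resulting list has no duplicate callees.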
trunk/Source/JavaScriptCore/bytecode/CodeBlock.cpp
r179015 → r179357

      /*
    -  * Copyright (C) 2008, 2009, 2010, 2012, 2013, 2014 Apple Inc. All rights reserved.
    +  * Copyright (C) 2008-2010, 2012-2015 Apple Inc. All rights reserved.
       * Copyright (C) 2008 Cameron Zwarich <[email protected]>
       *
      …
      while (m_incomingCalls.begin() != m_incomingCalls.end())
          m_incomingCalls.begin()->remove();
    + while (m_incomingPolymorphicCalls.begin() != m_incomingPolymorphicCalls.end())
    +     m_incomingPolymorphicCalls.begin()->remove();

      // Note that our outgoing calls will be removed from other CodeBlocks'
      …
          m_incomingCalls.push(incoming);
      }
    +
    + void CodeBlock::linkIncomingPolymorphicCall(ExecState* callerFrame, PolymorphicCallNode* incoming)
    + {
    +     noticeIncomingCall(callerFrame);
    +     m_incomingPolymorphicCalls.push(incoming);
    + }
      #endif // ENABLE(JIT)
      …
          m_incomingLLIntCalls.begin()->unlink();
      #if ENABLE(JIT)
    - if (m_incomingCalls.isEmpty())
    + if (m_incomingCalls.isEmpty() && m_incomingPolymorphicCalls.isEmpty())
          return;
      RepatchBuffer repatchBuffer(this);
      while (m_incomingCalls.begin() != m_incomingCalls.end())
          m_incomingCalls.begin()->unlink(repatchBuffer);
    + while (m_incomingPolymorphicCalls.begin() != m_incomingPolymorphicCalls.end())
    +     m_incomingPolymorphicCalls.begin()->unlink(repatchBuffer);
      #endif // ENABLE(JIT)
      }
      …
      if (Options::verboseCallLink())
    -     dataLog("Noticing call link from ", *callerCodeBlock, " to ", *this, "\n");
    -
    +     dataLog("Noticing call link from ", pointerDump(callerCodeBlock), " to ", *this, "\n");
    +
    + #if ENABLE(DFG_JIT)
      if (!m_shouldAlwaysBeInlined)
          return;
    -
    - #if ENABLE(DFG_JIT)
    +
    + if (!callerCodeBlock) {
    +     m_shouldAlwaysBeInlined = false;
    +     if (Options::verboseCallLink())
    +         dataLog("    Clearing SABI because caller is native.\n");
    +     return;
    + }
    +
      if (!hasBaselineJITProfiling())
          return;
      …
      }

    + if (JITCode::isOptimizingJIT(callerCodeBlock->jitType())) {
    +     m_shouldAlwaysBeInlined = false;
    +     if (Options::verboseCallLink())
    +         dataLog("    Clearing SABI bcause caller was already optimized.\n");
    +     return;
    + }
    +
      if (callerCodeBlock->codeType() != FunctionCode) {
          // If the caller is either eval or global code, assume that that won't be
      …
          return;
      }
    -
    - RELEASE_ASSERT(callerCodeBlock->m_capabilityLevelState != DFG::CapabilityLevelNotSet);
    +
    + if (callerCodeBlock->m_capabilityLevelState == DFG::CapabilityLevelNotSet) {
    +     dataLog("In call from ", *callerCodeBlock, " ", callerFrame->codeOrigin(), " to ", *this, ": caller's DFG capability level is not set.\n");
    +     CRASH();
    + }

      if (canCompile(callerCodeBlock->m_capabilityLevelState))
trunk/Source/JavaScriptCore/bytecode/CodeBlock.h
r179015 → r179357

      void linkIncomingCall(ExecState* callerFrame, CallLinkInfo*);
    -
    - bool isIncomingCallAlreadyLinked(CallLinkInfo* incoming)
    - {
    -     return m_incomingCalls.isOnList(incoming);
    - }
    + void linkIncomingPolymorphicCall(ExecState* callerFrame, PolymorphicCallNode*);
      #endif // ENABLE(JIT)
      …
      Bag<CallLinkInfo> m_callLinkInfos;
      SentinelLinkedList<CallLinkInfo, BasicRawSentinelNode<CallLinkInfo>> m_incomingCalls;
    + SentinelLinkedList<PolymorphicCallNode, BasicRawSentinelNode<PolymorphicCallNode>> m_incomingPolymorphicCalls;
      #endif
      std::unique_ptr<CompactJITCodeMap> m_jitCodeMap;
trunk/Source/JavaScriptCore/dfg/DFGAbstractInterpreterInlines.h
r177030 → r179357

          break;

    - case ProfiledCall:
    - case ProfiledConstruct:
    -     if (forNode(m_graph.varArgChild(node, 0)).m_value)
    -         m_state.setFoundConstants(true);
    -     clobberWorld(node->origin.semantic, clobberLimit);
    -     forNode(node).makeHeapTop();
    -     break;
    -
      case ForceOSRExit:
      case CheckBadCell:
trunk/Source/JavaScriptCore/dfg/DFGByteCodeParser.cpp
r178629 → r179357

      /*
    -  * Copyright (C) 2011, 2012, 2013, 2014 Apple Inc. All rights reserved.
    +  * Copyright (C) 2011-2015 Apple Inc. All rights reserved.
       *
       * Redistribution and use in source and binary forms, with or without
      …
      m_parameterSlots = parameterSlots;

    - int dummyThisArgument = op == Call || op == NativeCall || op == ProfiledCall ? 0 : 1;
    + int dummyThisArgument = op == Call || op == NativeCall ? 0 : 1;
      for (int i = 0 + dummyThisArgument; i < argCount; ++i)
          addVarArgChild(get(virtualRegisterForArgument(i, registerOffset)));
      …
          callLinkStatus = CallLinkStatus(callTarget->asJSValue()).setIsProved(true);

    - if ((!callLinkStatus.canOptimize() || callLinkStatus.size() != 1)
    -     && !isFTL(m_graph.m_plan.mode) && Options::useFTLJIT()
    -     && InlineCallFrame::isNormalCall(kind)
    -     && CallEdgeLog::isEnabled()
    -     && Options::dfgDoesCallEdgeProfiling()) {
    -     ASSERT(op == Call || op == Construct);
    -     if (op == Call)
    -         op = ProfiledCall;
    -     else
    -         op = ProfiledConstruct;
    - }
    + if (Options::verboseDFGByteCodeParsing())
    +     dataLog("    Handling call at ", currentCodeOrigin(), ": ", callLinkStatus, "\n");

      if (!callLinkStatus.canOptimize()) {
      …
      #if ENABLE(FTL_NATIVE_CALL_INLINING)
      if (isFTL(m_graph.m_plan.mode) && Options::optimizeNativeCalls() && callLinkStatus.size() == 1 && !callLinkStatus.couldTakeSlowPath()) {
    -     CallVariant callee = callLinkStatus[0].callee();
    +     CallVariant callee = callLinkStatus[0];
          JSFunction* function = callee.function();
          CodeSpecializationKind specializationKind = InlineCallFrame::specializationKindFor(kind);
      …
          callOpInfo = OpInfo(m_graph.freeze(function));

    -     if (op == Call || op == ProfiledCall)
    +     if (op == Call)
              op = NativeCall;
          else {
    -         ASSERT(op == Construct || op == ProfiledConstruct);
    +         ASSERT(op == Construct);
              op = NativeConstruct;
          }
      …
      if (!callLinkStatus.couldTakeSlowPath() && callLinkStatus.size() == 1) {
          emitFunctionChecks(
    -         callLinkStatus[0].callee(), callTargetNode, registerOffset, specializationKind);
    +         callLinkStatus[0], callTargetNode, registerOffset, specializationKind);
          bool result = attemptToInlineCall(
    -         callTargetNode, resultOperand, callLinkStatus[0].callee(), registerOffset,
    +         callTargetNode, resultOperand, callLinkStatus[0], registerOffset,
              argumentCountIncludingThis, nextOffset, kind, CallerDoesNormalLinking, prediction,
              inliningBalance);
          if (!result && !callLinkStatus.isProved())
    -         undoFunctionChecks(callLinkStatus[0].callee());
    +         undoFunctionChecks(callLinkStatus[0]);
          if (verbose) {
              dataLog("Done inlining (simple).\n");
      …
      bool allAreDirectCalls = true;
      for (unsigned i = callLinkStatus.size(); i--;) {
    -     if (callLinkStatus[i].callee().isClosureCall())
    +     if (callLinkStatus[i].isClosureCall())
              allAreDirectCalls = false;
          else
      …
      else {
          // FIXME: We should be able to handle this case, but it's tricky and we don't know of cases
    -     // where it would be beneficial. Also, CallLinkStatus would make all callees appear like
    -     // closure calls if any calls were closure calls - except for calls to internal functions.
    -     // So this will only arise if some callees are internal functions and others are closures.
    +     // where it would be beneficial. It might be best to handle these cases as if all calls were
    +     // closure calls.
          // https://bugs.webkit.org/show_bug.cgi?id=136020
          if (verbose) {
      …
      Vector<BasicBlock*> landingBlocks;

    - // We make force this true if we give up on inlining any of the edges.
    + // We may force this true if we give up on inlining any of the edges.
      bool couldTakeSlowPath = callLinkStatus.couldTakeSlowPath();
      …
      bool inliningResult = attemptToInlineCall(
    -     myCallTargetNode, resultOperand, callLinkStatus[i].callee(), registerOffset,
    +     myCallTargetNode, resultOperand, callLinkStatus[i], registerOffset,
          argumentCountIncludingThis, nextOffset, kind, CallerLinksManually, prediction,
          inliningBalance);
      …
      JSCell* thingToCaseOn;
      if (allAreDirectCalls)
    -     thingToCaseOn = callLinkStatus[i].callee().nonExecutableCallee();
    +     thingToCaseOn = callLinkStatus[i].nonExecutableCallee();
      else {
          ASSERT(allAreClosureCalls);
    -     thingToCaseOn = callLinkStatus[i].callee().executable();
    +     thingToCaseOn = callLinkStatus[i].executable();
      }
      data.cases.append(SwitchCase(m_graph.freeze(thingToCaseOn), block.get()));
      …
      if (verbose)
    -     dataLog("Finished inlining ", callLinkStatus[i].callee(), " at ", currentCodeOrigin(), ".\n");
    +     dataLog("Finished inlining ", callLinkStatus[i], " at ", currentCodeOrigin(), ".\n");
trunk/Source/JavaScriptCore/dfg/DFGClobberize.h
r176836 → r179357

      case Call:
      case Construct:
    - case ProfiledCall:
    - case ProfiledConstruct:
      case NativeCall:
      case NativeConstruct:
trunk/Source/JavaScriptCore/dfg/DFGConstantFoldingPhase.cpp
r175411 → r179357

      }

    - case ProfiledCall:
    - case ProfiledConstruct: {
    -     if (!m_state.forNode(m_graph.varArgChild(node, 0)).m_value)
    -         break;
    -
    -     // If we were able to prove that the callee is a constant then the normal call
    -     // inline cache will record this callee. This means that there is no need to do any
    -     // additional profiling.
    -     m_interpreter.execute(indexInBlock);
    -     node->setOp(node->op() == ProfiledCall ? Call : Construct);
    -     eliminated = true;
    -     break;
    - }
    -
      default:
          break;
trunk/Source/JavaScriptCore/dfg/DFGDoesGC.cpp
r176836 → r179357

      case NativeCall:
      case NativeConstruct:
    - case ProfiledCall:
    - case ProfiledConstruct:
      case Breakpoint:
      case ProfileWillCall:
trunk/Source/JavaScriptCore/dfg/DFGDriver.cpp
r174167 → r179357

      vm.getCTIStub(linkCallThunkGenerator);
      vm.getCTIStub(linkConstructThunkGenerator);
    - vm.getCTIStub(linkClosureCallThunkGenerator);
    + vm.getCTIStub(linkPolymorphicCallThunkGenerator);
      vm.getCTIStub(virtualCallThunkGenerator);
      vm.getCTIStub(virtualConstructThunkGenerator);
      …
      vm.getCTIStub(linkCallThatPreservesRegsThunkGenerator);
      vm.getCTIStub(linkConstructThatPreservesRegsThunkGenerator);
    - vm.getCTIStub(linkClosureCallThatPreservesRegsThunkGenerator);
    + vm.getCTIStub(linkPolymorphicCallThatPreservesRegsThunkGenerator);
      vm.getCTIStub(virtualCallThatPreservesRegsThunkGenerator);
      vm.getCTIStub(virtualConstructThatPreservesRegsThunkGenerator);
      }

    - if (CallEdgeLog::isEnabled())
    -     vm.ensureCallEdgeLog().processLog();
    -
      if (vm.typeProfiler())
          vm.typeProfilerLog()->processLogEntries(ASCIILiteral("Preparing for DFG compilation."));
trunk/Source/JavaScriptCore/dfg/DFGFixupPhase.cpp
r177146 → r179357

      case Call:
      case Construct:
    - case ProfiledCall:
    - case ProfiledConstruct:
      case ProfileControlFlow:
      case NativeCall:
trunk/Source/JavaScriptCore/dfg/DFGNode.h
r176836 → r179357

      case Call:
      case Construct:
    - case ProfiledCall:
    - case ProfiledConstruct:
      case NativeCall:
      case NativeConstruct:
trunk/Source/JavaScriptCore/dfg/DFGNodeType.h
r176836 → r179357

      macro(Call, NodeResultJS | NodeMustGenerate | NodeHasVarArgs | NodeClobbersWorld) \
      macro(Construct, NodeResultJS | NodeMustGenerate | NodeHasVarArgs | NodeClobbersWorld) \
    - macro(ProfiledCall, NodeResultJS | NodeMustGenerate | NodeHasVarArgs | NodeClobbersWorld) \
    - macro(ProfiledConstruct, NodeResultJS | NodeMustGenerate | NodeHasVarArgs | NodeClobbersWorld) \
      macro(NativeCall, NodeResultJS | NodeMustGenerate | NodeHasVarArgs | NodeClobbersWorld) \
      macro(NativeConstruct, NodeResultJS | NodeMustGenerate | NodeHasVarArgs | NodeClobbersWorld) \
trunk/Source/JavaScriptCore/dfg/DFGOperations.cpp
r178928 → r179357

      CodeBlock* codeBlock = exec->codeBlock();

    + if (codeBlock->jitType() != JITCode::DFGJIT) {
    +     dataLog("Unexpected code block in DFG->FTL tier-up: ", *codeBlock, "\n");
    +     RELEASE_ASSERT_NOT_REACHED();
    + }
    +
      JITCode* jitCode = codeBlock->jitCode()->dfg();
      …
      DeferGC deferGC(vm->heap);
      CodeBlock* codeBlock = exec->codeBlock();
    +
    + if (codeBlock->jitType() != JITCode::DFGJIT) {
    +     dataLog("Unexpected code block in DFG->FTL tier-up: ", *codeBlock, "\n");
    +     RELEASE_ASSERT_NOT_REACHED();
    + }

      JITCode* jitCode = codeBlock->jitCode()->dfg();
trunk/Source/JavaScriptCore/dfg/DFGPredictionPropagationPhase.cpp
r176836 → r179357

      case Call:
      case Construct:
    - case ProfiledCall:
    - case ProfiledConstruct:
      case NativeCall:
      case NativeConstruct:
trunk/Source/JavaScriptCore/dfg/DFGSafeToExecute.h
r176836 → r179357

      case Call:
      case Construct:
    - case ProfiledCall:
    - case ProfiledConstruct:
      case NewObject:
      case NewArray:
trunk/Source/JavaScriptCore/dfg/DFGSpeculativeJIT32_64.cpp
r178856 → r179357

      void SpeculativeJIT::emitCall(Node* node)
      {
    -     bool isCall = node->op() == Call || node->op() == ProfiledCall;
    +     bool isCall = node->op() == Call;
          if (!isCall)
    -         ASSERT(node->op() == Construct || node->op() == ProfiledConstruct);
    +         ASSERT(node->op() == Construct);

          // For constructors, the this argument is not passed but we have to make space
      …
          CallLinkInfo* info = m_jit.codeBlock()->addCallLinkInfo();

    -     if (node->op() == ProfiledCall || node->op() == ProfiledConstruct) {
    -         m_jit.vm()->callEdgeLog->emitLogCode(
    -             m_jit, info->callEdgeProfile, callee.jsValueRegs());
    -     }
    -
          slowPath.append(branchNotCell(callee.jsValueRegs()));
          slowPath.append(m_jit.branchPtrWithPatch(MacroAssembler::NotEqual, calleePayloadGPR, targetToCheck));
      …
      case Call:
      case Construct:
    - case ProfiledCall:
    - case ProfiledConstruct:
          emitCall(node);
          break;
trunk/Source/JavaScriptCore/dfg/DFGSpeculativeJIT64.cpp
r178856 r179357 625 625 void SpeculativeJIT::emitCall(Node* node) 626 626 { 627 bool isCall = node->op() == Call || node->op() == ProfiledCall;627 bool isCall = node->op() == Call; 628 628 if (!isCall) 629 DFG_ASSERT(m_jit.graph(), node, node->op() == Construct || node->op() == ProfiledConstruct);629 DFG_ASSERT(m_jit.graph(), node, node->op() == Construct); 630 630 631 631 // For constructors, the this argument is not passed but we have to make space … … 670 670 CallLinkInfo* callLinkInfo = m_jit.codeBlock()->addCallLinkInfo(); 671 671 672 if (node->op() == ProfiledCall || node->op() == ProfiledConstruct) {673 m_jit.vm()->callEdgeLog->emitLogCode(674 m_jit, callLinkInfo->callEdgeProfile, JSValueRegs(calleeGPR));675 }676 677 672 slowPath = m_jit.branchPtrWithPatch(MacroAssembler::NotEqual, calleeGPR, targetToCheck, MacroAssembler::TrustedImmPtr(0)); 678 673 … … 4237 4232 case Call: 4238 4233 case Construct: 4239 case ProfiledCall:4240 case ProfiledConstruct:4241 4234 emitCall(node); 4242 4235 break; -
trunk/Source/JavaScriptCore/dfg/DFGTierUpCheckInjectionPhase.cpp
r173069 r179357 51 51 return false; 52 52 53 if (m_graph.m_profiledBlock->m_didFailFTLCompilation) { 54 removeFTLProfiling(); 53 if (m_graph.m_profiledBlock->m_didFailFTLCompilation) 55 54 return false; 56 }57 55 58 56 #if ENABLE(FTL_JIT) 59 57 FTL::CapabilityLevel level = FTL::canCompile(m_graph); 60 if (level == FTL::CannotCompile) { 61 removeFTLProfiling(); 58 if (level == FTL::CannotCompile) 62 59 return false; 63 }64 60 65 61 if (!Options::enableOSREntryToFTL()) … … 123 119 #endif // ENABLE(FTL_JIT) 124 120 } 125 126 private:127 void removeFTLProfiling()128 {129 for (BlockIndex blockIndex = m_graph.numBlocks(); blockIndex--;) {130 BasicBlock* block = m_graph.block(blockIndex);131 if (!block)132 continue;133 134 for (unsigned nodeIndex = 0; nodeIndex < block->size(); ++nodeIndex) {135 Node* node = block->at(nodeIndex);136 switch (node->op()) {137 case ProfiledCall:138 node->setOp(Call);139 break;140 141 case ProfiledConstruct:142 node->setOp(Construct);143 break;144 145 default:146 break;147 }148 }149 }150 }151 121 }; 152 122 -
trunk/Source/JavaScriptCore/ftl/FTLCapabilities.cpp
r176625 r179357 177 177 // These are OK. 178 178 break; 179 case ProfiledCall:180 case ProfiledConstruct:181 // These are OK not because the FTL can support them, but because if the DFG sees one of182 // these then the FTL will see a normal Call/Construct.183 break;184 179 case Identity: 185 180 // No backend handles this because it will be optimized out. But we may check -
trunk/Source/JavaScriptCore/heap/Heap.cpp
r179348 r179357 994 994 } 995 995 996 if (vm()->callEdgeLog) {997 DeferGCForAWhile awhile(*this);998 vm()->callEdgeLog->processLog();999 }1000 1001 996 RELEASE_ASSERT(!m_deferralDepth); 1002 997 ASSERT(vm()->currentThreadIsHoldingAPILock()); -
trunk/Source/JavaScriptCore/jit/BinarySwitch.h
r179223 r179357 55 55 // unsigned index = switch.caseIndex(); // index into casesVector, above 56 56 // ... // generate code for this case 57 // ... = jit.jump(); // you have to jump out yourself; falling through causes undefined behavior 57 58 // } 58 59 // switch.fallThrough().link(&jit); -
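The comment added here documents that each BinarySwitch case must end with an explicit jump. As a minimal sketch of that documented pattern (assuming a CCallHelpers& jit, a GPRReg valueGPR holding the comparison value, and a Vector<int64_t> caseValues already in scope; these names are illustrative and not part of the patch):

    BinarySwitch binarySwitch(valueGPR, caseValues, BinarySwitch::IntPtr);
    CCallHelpers::JumpList done;
    while (binarySwitch.advance(jit)) {
        unsigned index = binarySwitch.caseIndex(); // index into caseValues
        // ... emit the code for caseValues[index] here ...
        done.append(jit.jump()); // jump out explicitly; falling through is undefined behavior
    }
    binarySwitch.fallThrough().link(&jit); // no case matched
    // ... emit the miss path here, then:
    done.link(&jit);

This is the same shape the new polymorphic call stub in Repatch.cpp (further down) uses to dispatch on the callee or its executable.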
trunk/Source/JavaScriptCore/jit/JITCall.cpp
r178856 r179357 215 215 CallLinkInfo* info = m_codeBlock->addCallLinkInfo(); 216 216 217 if (CallEdgeLog::isEnabled() && shouldEmitProfiling()218 && Options::baselineDoesCallEdgeProfiling())219 m_vm->ensureCallEdgeLog().emitLogCode(*this, info->callEdgeProfile, JSValueRegs(regT0));220 221 217 if (opcodeID == op_call_eval) { 222 218 compileCallEval(instruction); -
trunk/Source/JavaScriptCore/jit/JITCall32_64.cpp
r178856 r179357 301 301 CallLinkInfo* info = m_codeBlock->addCallLinkInfo(); 302 302 303 if (CallEdgeLog::isEnabled() && shouldEmitProfiling()304 && Options::baselineDoesCallEdgeProfiling()) {305 m_vm->ensureCallEdgeLog().emitLogCode(306 *this, info->callEdgeProfile, JSValueRegs(regT1, regT0));307 }308 309 303 if (opcodeID == op_call_eval) { 310 304 compileCallEval(instruction); -
trunk/Source/JavaScriptCore/jit/JITOperations.cpp
r178928 r179357 1 1 /* 2 * Copyright (C) 2013 , 2014Apple Inc. All rights reserved.2 * Copyright (C) 2013-2015 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 777 777 } 778 778 779 static bool attemptToOptimizeClosureCall( 780 ExecState* execCallee, RegisterPreservationMode registers, JSCell* calleeAsFunctionCell, 781 CallLinkInfo& callLinkInfo) 782 { 783 if (!calleeAsFunctionCell) 784 return false; 785 786 JSFunction* callee = jsCast<JSFunction*>(calleeAsFunctionCell); 787 JSFunction* oldCallee = callLinkInfo.callee.get(); 788 789 if (!oldCallee || oldCallee->executable() != callee->executable()) 790 return false; 791 792 ASSERT(callee->executable()->hasJITCodeForCall()); 793 MacroAssemblerCodePtr codePtr = 794 callee->executable()->generatedJITCodeForCall()->addressForCall( 795 *execCallee->callerFrame()->codeBlock()->vm(), callee->executable(), 796 ArityCheckNotRequired, registers); 797 798 CodeBlock* codeBlock; 799 if (callee->executable()->isHostFunction()) 800 codeBlock = 0; 801 else { 802 codeBlock = jsCast<FunctionExecutable*>(callee->executable())->codeBlockForCall(); 803 if (execCallee->argumentCountIncludingThis() < static_cast<size_t>(codeBlock->numParameters()) || callLinkInfo.callType == CallLinkInfo::CallVarargs || callLinkInfo.callType == CallLinkInfo::ConstructVarargs) 804 return false; 805 } 806 807 linkClosureCall( 808 execCallee, callLinkInfo, codeBlock, callee->executable(), codePtr, registers); 809 810 return true; 811 } 812 813 char* JIT_OPERATION operationLinkClosureCall(ExecState* execCallee, CallLinkInfo* callLinkInfo) 779 char* JIT_OPERATION operationLinkPolymorphicCall(ExecState* execCallee, CallLinkInfo* callLinkInfo) 814 780 { 815 781 JSCell* calleeAsFunctionCell; 816 782 char* result = virtualForWithFunction(execCallee, CodeForCall, RegisterPreservationNotRequired, calleeAsFunctionCell); 817 783 818 if (!attemptToOptimizeClosureCall(execCallee, RegisterPreservationNotRequired, calleeAsFunctionCell, *callLinkInfo)) 819 linkSlowFor(execCallee, *callLinkInfo, CodeForCall, RegisterPreservationNotRequired); 784 linkPolymorphicCall(execCallee, *callLinkInfo, CallVariant(calleeAsFunctionCell), RegisterPreservationNotRequired); 820 785 821 786 return result; … … 832 797 } 833 798 834 char* JIT_OPERATION operationLink ClosureCallThatPreservesRegs(ExecState* execCallee, CallLinkInfo* callLinkInfo)799 char* JIT_OPERATION operationLinkPolymorphicCallThatPreservesRegs(ExecState* execCallee, CallLinkInfo* callLinkInfo) 835 800 { 836 801 JSCell* calleeAsFunctionCell; 837 802 char* result = virtualForWithFunction(execCallee, CodeForCall, MustPreserveRegisters, calleeAsFunctionCell); 838 803 839 if (!attemptToOptimizeClosureCall(execCallee, MustPreserveRegisters, calleeAsFunctionCell, *callLinkInfo)) 840 linkSlowFor(execCallee, *callLinkInfo, CodeForCall, MustPreserveRegisters); 804 linkPolymorphicCall(execCallee, *callLinkInfo, CallVariant(calleeAsFunctionCell), MustPreserveRegisters); 841 805 842 806 return result; … … 1034 998 1035 999 CodeBlock* codeBlock = exec->codeBlock(); 1000 if (codeBlock->jitType() != JITCode::BaselineJIT) { 1001 dataLog("Unexpected code block in Baseline->DFG tier-up: ", *codeBlock, "\n"); 1002 RELEASE_ASSERT_NOT_REACHED(); 1003 } 1036 1004 1037 1005 if (bytecodeIndex) { -
trunk/Source/JavaScriptCore/jit/JITOperations.h
r178143 r179357 247 247 EncodedJSValue JIT_OPERATION operationCallEval(ExecState*, ExecState*) WTF_INTERNAL; 248 248 char* JIT_OPERATION operationLinkCall(ExecState*, CallLinkInfo*) WTF_INTERNAL; 249 char* JIT_OPERATION operationLink ClosureCall(ExecState*, CallLinkInfo*) WTF_INTERNAL;249 char* JIT_OPERATION operationLinkPolymorphicCall(ExecState*, CallLinkInfo*) WTF_INTERNAL; 250 250 char* JIT_OPERATION operationVirtualCall(ExecState*, CallLinkInfo*) WTF_INTERNAL; 251 251 char* JIT_OPERATION operationVirtualConstruct(ExecState*, CallLinkInfo*) WTF_INTERNAL; 252 252 char* JIT_OPERATION operationLinkConstruct(ExecState*, CallLinkInfo*) WTF_INTERNAL; 253 253 char* JIT_OPERATION operationLinkCallThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL; 254 char* JIT_OPERATION operationLink ClosureCallThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL;254 char* JIT_OPERATION operationLinkPolymorphicCallThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL; 255 255 char* JIT_OPERATION operationVirtualCallThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL; 256 256 char* JIT_OPERATION operationVirtualConstructThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL; … … 392 392 } 393 393 394 inline P_JITOperation_ECli operationLink ClosureCallFor(RegisterPreservationMode registers)394 inline P_JITOperation_ECli operationLinkPolymorphicCallFor(RegisterPreservationMode registers) 395 395 { 396 396 switch (registers) { 397 397 case RegisterPreservationNotRequired: 398 return operationLink ClosureCall;398 return operationLinkPolymorphicCall; 399 399 case MustPreserveRegisters: 400 return operationLink ClosureCallThatPreservesRegs;400 return operationLinkPolymorphicCallThatPreservesRegs; 401 401 } 402 402 RELEASE_ASSERT_NOT_REACHED(); -
trunk/Source/JavaScriptCore/jit/JITStubRoutine.h
r166218 r179357 142 142 } 143 143 144 // Return true if you are still valid after. Return false if you are now invalid. If you return 145 // false, you will usually not do any clearing because the idea is that you will simply be 146 // destroyed. 144 147 virtual bool visitWeak(RepatchBuffer&); 145 148 -
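To make the new visitWeak() comment concrete, a hypothetical override honoring that contract could look like the following; the subclass name and the m_weakCallee member (a WriteBarrier to the weakly referenced callee) are invented for illustration and are not part of the patch:

    bool ExampleCallStubRoutine::visitWeak(RepatchBuffer&)
    {
        if (!Heap::isMarked(m_weakCallee.get()))
            return false; // now invalid; no clearing needed, the routine is about to be destroyed
        return true; // still valid
    }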
trunk/Source/JavaScriptCore/jit/JITWriteBarrier.h
r163576 r179357 32 32 #include "SlotVisitor.h" 33 33 #include "UnusedPointer.h" 34 #include "VM.h" 34 35 #include "WriteBarrier.h" 35 36 -
trunk/Source/JavaScriptCore/jit/Repatch.cpp
r178928 r179357 30 30 31 31 #include "AccessorCallJITStubRoutine.h" 32 #include "BinarySwitch.h" 32 33 #include "CCallHelpers.h" 33 34 #include "DFGOperations.h" … … 1576 1577 1577 1578 static void linkSlowFor( 1579 RepatchBuffer& repatchBuffer, VM* vm, CallLinkInfo& callLinkInfo, ThunkGenerator generator) 1580 { 1581 repatchBuffer.relink( 1582 callLinkInfo.callReturnLocation, vm->getCTIStub(generator).code()); 1583 } 1584 1585 static void linkSlowFor( 1578 1586 RepatchBuffer& repatchBuffer, VM* vm, CallLinkInfo& callLinkInfo, 1579 1587 CodeSpecializationKind kind, RegisterPreservationMode registers) 1580 1588 { 1581 repatchBuffer.relink( 1582 callLinkInfo.callReturnLocation, 1583 vm->getCTIStub(virtualThunkGeneratorFor(kind, registers)).code()); 1589 linkSlowFor(repatchBuffer, vm, callLinkInfo, virtualThunkGeneratorFor(kind, registers)); 1584 1590 } 1585 1591 … … 1593 1599 CodeBlock* callerCodeBlock = exec->callerFrame()->codeBlock(); 1594 1600 1595 // If you're being call-linked from a DFG caller then you obviously didn't get inlined.1596 if (calleeCodeBlock && JITCode::isOptimizingJIT(callerCodeBlock->jitType()))1597 calleeCodeBlock->m_shouldAlwaysBeInlined = false;1598 1599 1601 VM* vm = callerCodeBlock->vm(); 1600 1602 … … 1612 1614 1613 1615 if (kind == CodeForCall) { 1614 repatchBuffer.relink(callLinkInfo.callReturnLocation, vm->getCTIStub(linkClosureCallThunkGeneratorFor(registers)).code()); 1616 linkSlowFor( 1617 repatchBuffer, vm, callLinkInfo, linkPolymorphicCallThunkGeneratorFor(registers)); 1615 1618 return; 1616 1619 } … … 1632 1635 } 1633 1636 1634 void linkClosureCall( 1635 ExecState* exec, CallLinkInfo& callLinkInfo, CodeBlock* calleeCodeBlock, 1636 ExecutableBase* executable, MacroAssemblerCodePtr codePtr, 1637 RegisterPreservationMode registers) 1638 { 1639 ASSERT(!callLinkInfo.stub); 1637 static void revertCall( 1638 RepatchBuffer& repatchBuffer, VM* vm, CallLinkInfo& callLinkInfo, ThunkGenerator generator) 1639 { 1640 repatchBuffer.revertJumpReplacementToBranchPtrWithPatch( 1641 RepatchBuffer::startOfBranchPtrWithPatchOnRegister(callLinkInfo.hotPathBegin), 1642 static_cast<MacroAssembler::RegisterID>(callLinkInfo.calleeGPR), 0); 1643 linkSlowFor(repatchBuffer, vm, callLinkInfo, generator); 1644 callLinkInfo.hasSeenShouldRepatch = false; 1645 callLinkInfo.callee.clear(); 1646 callLinkInfo.stub.clear(); 1647 if (callLinkInfo.isOnList()) 1648 callLinkInfo.remove(); 1649 } 1650 1651 void unlinkFor( 1652 RepatchBuffer& repatchBuffer, CallLinkInfo& callLinkInfo, 1653 CodeSpecializationKind kind, RegisterPreservationMode registers) 1654 { 1655 if (Options::showDisassembly()) 1656 dataLog("Unlinking call from ", callLinkInfo.callReturnLocation, " in request from ", pointerDump(repatchBuffer.codeBlock()), "\n"); 1657 1658 revertCall( 1659 repatchBuffer, repatchBuffer.codeBlock()->vm(), callLinkInfo, 1660 linkThunkGeneratorFor(kind, registers)); 1661 } 1662 1663 void linkVirtualFor( 1664 ExecState* exec, CallLinkInfo& callLinkInfo, 1665 CodeSpecializationKind kind, RegisterPreservationMode registers) 1666 { 1667 // FIXME: We could generate a virtual call stub here. This would lead to faster virtual calls 1668 // by eliminating the branch prediction bottleneck inside the shared virtual call thunk. 
1640 1669 1641 1670 CodeBlock* callerCodeBlock = exec->callerFrame()->codeBlock(); 1642 1671 VM* vm = callerCodeBlock->vm(); 1672 1673 if (shouldShowDisassemblyFor(callerCodeBlock)) 1674 dataLog("Linking virtual call at ", *callerCodeBlock, " ", exec->callerFrame()->codeOrigin(), "\n"); 1675 1676 RepatchBuffer repatchBuffer(callerCodeBlock); 1677 revertCall(repatchBuffer, vm, callLinkInfo, virtualThunkGeneratorFor(kind, registers)); 1678 } 1679 1680 namespace { 1681 struct CallToCodePtr { 1682 CCallHelpers::Call call; 1683 MacroAssemblerCodePtr codePtr; 1684 }; 1685 } // annonymous namespace 1686 1687 void linkPolymorphicCall( 1688 ExecState* exec, CallLinkInfo& callLinkInfo, CallVariant newVariant, 1689 RegisterPreservationMode registers) 1690 { 1691 // Currently we can't do anything for non-function callees. 1692 // https://bugs.webkit.org/show_bug.cgi?id=140685 1693 if (!newVariant || !newVariant.executable()) { 1694 linkVirtualFor(exec, callLinkInfo, CodeForCall, registers); 1695 return; 1696 } 1697 1698 CodeBlock* callerCodeBlock = exec->callerFrame()->codeBlock(); 1699 VM* vm = callerCodeBlock->vm(); 1700 1701 CallVariantList list; 1702 if (PolymorphicCallStubRoutine* stub = callLinkInfo.stub.get()) 1703 list = stub->variants(); 1704 else if (JSFunction* oldCallee = callLinkInfo.callee.get()) 1705 list = CallVariantList{ CallVariant(oldCallee) }; 1706 1707 list = variantListWithVariant(list, newVariant); 1708 1709 // If there are any closure calls then it makes sense to treat all of them as closure calls. 1710 // This makes switching on callee cheaper. It also produces profiling that's easier on the DFG; 1711 // the DFG doesn't really want to deal with a combination of closure and non-closure callees. 1712 bool isClosureCall = false; 1713 for (CallVariant variant : list) { 1714 if (variant.isClosureCall()) { 1715 list = despecifiedVariantList(list); 1716 isClosureCall = true; 1717 break; 1718 } 1719 } 1720 1721 Vector<PolymorphicCallCase> callCases; 1722 1723 // Figure out what our cases are. 1724 for (CallVariant variant : list) { 1725 CodeBlock* codeBlock; 1726 if (variant.executable()->isHostFunction()) 1727 codeBlock = nullptr; 1728 else { 1729 codeBlock = jsCast<FunctionExecutable*>(variant.executable())->codeBlockForCall(); 1730 1731 // If we cannot handle a callee, assume that it's better for this whole thing to be a 1732 // virtual call. 1733 if (exec->argumentCountIncludingThis() < static_cast<size_t>(codeBlock->numParameters()) || callLinkInfo.callType == CallLinkInfo::CallVarargs || callLinkInfo.callType == CallLinkInfo::ConstructVarargs) { 1734 linkVirtualFor(exec, callLinkInfo, CodeForCall, registers); 1735 return; 1736 } 1737 } 1738 1739 callCases.append(PolymorphicCallCase(variant, codeBlock)); 1740 } 1741 1742 // If we are over the limit, just use a normal virtual call. 
1743 unsigned maxPolymorphicCallVariantListSize; 1744 if (callerCodeBlock->jitType() == JITCode::topTierJIT()) 1745 maxPolymorphicCallVariantListSize = Options::maxPolymorphicCallVariantListSizeForTopTier(); 1746 else 1747 maxPolymorphicCallVariantListSize = Options::maxPolymorphicCallVariantListSize(); 1748 if (list.size() > maxPolymorphicCallVariantListSize) { 1749 linkVirtualFor(exec, callLinkInfo, CodeForCall, registers); 1750 return; 1751 } 1643 1752 1644 1753 GPRReg calleeGPR = static_cast<GPRReg>(callLinkInfo.calleeGPR); … … 1656 1765 okArgumentCount.link(&stubJit); 1657 1766 } 1767 1768 GPRReg scratch = AssemblyHelpers::selectScratchGPR(calleeGPR); 1769 GPRReg comparisonValueGPR; 1770 1771 if (isClosureCall) { 1772 // Verify that we have a function and stash the executable in scratch. 1658 1773 1659 1774 #if USE(JSVALUE64) 1660 // We can safely clobber everything except the calleeGPR. We can't rely on tagMaskRegister 1661 // being set. So we do this the hard way. 1662 GPRReg scratch = AssemblyHelpers::selectScratchGPR(calleeGPR); 1663 stubJit.move(MacroAssembler::TrustedImm64(TagMask), scratch); 1664 slowPath.append(stubJit.branchTest64(CCallHelpers::NonZero, calleeGPR, scratch)); 1775 // We can safely clobber everything except the calleeGPR. We can't rely on tagMaskRegister 1776 // being set. So we do this the hard way. 1777 stubJit.move(MacroAssembler::TrustedImm64(TagMask), scratch); 1778 slowPath.append(stubJit.branchTest64(CCallHelpers::NonZero, calleeGPR, scratch)); 1665 1779 #else 1666 // We would have already checked that the callee is a cell. 1667 #endif 1668 1669 slowPath.append( 1670 stubJit.branch8( 1671 CCallHelpers::NotEqual, 1672 CCallHelpers::Address(calleeGPR, JSCell::typeInfoTypeOffset()), 1673 CCallHelpers::TrustedImm32(JSFunctionType))); 1674 1675 slowPath.append( 1676 stubJit.branchPtr( 1677 CCallHelpers::NotEqual, 1780 // We would have already checked that the callee is a cell. 
1781 #endif 1782 1783 slowPath.append( 1784 stubJit.branch8( 1785 CCallHelpers::NotEqual, 1786 CCallHelpers::Address(calleeGPR, JSCell::typeInfoTypeOffset()), 1787 CCallHelpers::TrustedImm32(JSFunctionType))); 1788 1789 stubJit.loadPtr( 1678 1790 CCallHelpers::Address(calleeGPR, JSFunction::offsetOfExecutable()), 1679 CCallHelpers::TrustedImmPtr(executable))); 1680 1681 AssemblyHelpers::Call call = stubJit.nearCall(); 1682 AssemblyHelpers::Jump done = stubJit.jump(); 1791 scratch); 1792 1793 comparisonValueGPR = scratch; 1794 } else 1795 comparisonValueGPR = calleeGPR; 1796 1797 Vector<int64_t> caseValues(callCases.size()); 1798 Vector<CallToCodePtr> calls(callCases.size()); 1799 std::unique_ptr<uint32_t[]> fastCounts; 1800 1801 if (callerCodeBlock->jitType() != JITCode::topTierJIT()) 1802 fastCounts = std::make_unique<uint32_t[]>(callCases.size()); 1803 1804 for (size_t i = callCases.size(); i--;) { 1805 if (fastCounts) 1806 fastCounts[i] = 0; 1807 1808 CallVariant variant = callCases[i].variant(); 1809 if (isClosureCall) 1810 caseValues[i] = bitwise_cast<intptr_t>(variant.executable()); 1811 else 1812 caseValues[i] = bitwise_cast<intptr_t>(variant.function()); 1813 } 1814 1815 GPRReg fastCountsBaseGPR = 1816 AssemblyHelpers::selectScratchGPR(calleeGPR, comparisonValueGPR, GPRInfo::regT3); 1817 stubJit.move(CCallHelpers::TrustedImmPtr(fastCounts.get()), fastCountsBaseGPR); 1818 1819 BinarySwitch binarySwitch(comparisonValueGPR, caseValues, BinarySwitch::IntPtr); 1820 CCallHelpers::JumpList done; 1821 while (binarySwitch.advance(stubJit)) { 1822 size_t caseIndex = binarySwitch.caseIndex(); 1823 1824 CallVariant variant = callCases[caseIndex].variant(); 1825 1826 ASSERT(variant.executable()->hasJITCodeForCall()); 1827 MacroAssemblerCodePtr codePtr = 1828 variant.executable()->generatedJITCodeForCall()->addressForCall( 1829 *vm, variant.executable(), ArityCheckNotRequired, registers); 1830 1831 if (fastCounts) { 1832 stubJit.add32( 1833 CCallHelpers::TrustedImm32(1), 1834 CCallHelpers::Address(fastCountsBaseGPR, caseIndex * sizeof(uint32_t))); 1835 } 1836 calls[caseIndex].call = stubJit.nearCall(); 1837 calls[caseIndex].codePtr = codePtr; 1838 done.append(stubJit.jump()); 1839 } 1683 1840 1684 1841 slowPath.link(&stubJit); 1842 binarySwitch.fallThrough().link(&stubJit); 1685 1843 stubJit.move(calleeGPR, GPRInfo::regT0); 1686 1844 #if USE(JSVALUE32_64) … … 1692 1850 stubJit.restoreReturnAddressBeforeReturn(GPRInfo::regT4); 1693 1851 AssemblyHelpers::Jump slow = stubJit.jump(); 1694 1852 1695 1853 LinkBuffer patchBuffer(*vm, stubJit, callerCodeBlock); 1696 1854 1697 patchBuffer.link(call, FunctionPtr(codePtr.executableAddress())); 1855 RELEASE_ASSERT(callCases.size() == calls.size()); 1856 for (CallToCodePtr callToCodePtr : calls) { 1857 patchBuffer.link( 1858 callToCodePtr.call, FunctionPtr(callToCodePtr.codePtr.executableAddress())); 1859 } 1698 1860 if (JITCode::isOptimizingJIT(callerCodeBlock->jitType())) 1699 1861 patchBuffer.link(done, callLinkInfo.callReturnLocation.labelAtOffset(0)); 1700 1862 else 1701 1863 patchBuffer.link(done, callLinkInfo.hotPathOther.labelAtOffset(0)); 1702 patchBuffer.link(slow, CodeLocationLabel(vm->getCTIStub( virtualThunkGeneratorFor(CodeForCall,registers)).code()));1703 1704 RefPtr< ClosureCallStubRoutine> stubRoutine = adoptRef(new ClosureCallStubRoutine(1864 patchBuffer.link(slow, CodeLocationLabel(vm->getCTIStub(linkPolymorphicCallThunkGeneratorFor(registers)).code())); 1865 1866 RefPtr<PolymorphicCallStubRoutine> stubRoutine = adoptRef(new 
PolymorphicCallStubRoutine( 1705 1867 FINALIZE_CODE_FOR( 1706 1868 callerCodeBlock, patchBuffer, 1707 (" Closure call stub for %s, return point %p, target %p (%s)",1869 ("Polymorphic call stub for %s, return point %p, targets %s", 1708 1870 toCString(*callerCodeBlock).data(), callLinkInfo.callReturnLocation.labelAtOffset(0).executableAddress(), 1709 codePtr.executableAddress(), toCString(pointerDump(calleeCodeBlock)).data())), 1710 *vm, callerCodeBlock->ownerExecutable(), executable)); 1871 toCString(listDump(callCases)).data())), 1872 *vm, callerCodeBlock->ownerExecutable(), exec->callerFrame(), callLinkInfo, callCases, 1873 WTF::move(fastCounts))); 1711 1874 1712 1875 RepatchBuffer repatchBuffer(callerCodeBlock); … … 1715 1878 RepatchBuffer::startOfBranchPtrWithPatchOnRegister(callLinkInfo.hotPathBegin), 1716 1879 CodeLocationLabel(stubRoutine->code().code())); 1880 // This is weird. The original slow path should no longer be reachable. 1717 1881 linkSlowFor(repatchBuffer, vm, callLinkInfo, CodeForCall, registers); 1718 1882 1883 // If there had been a previous stub routine, that one will die as soon as the GC runs and sees 1884 // that it's no longer on stack. 1719 1885 callLinkInfo.stub = stubRoutine.release(); 1720 1886 1721 ASSERT(!calleeCodeBlock || calleeCodeBlock->isIncomingCallAlreadyLinked(&callLinkInfo)); 1887 // The call link info no longer has a call cache apart from the jump to the polymorphic call 1888 // stub. 1889 if (callLinkInfo.isOnList()) 1890 callLinkInfo.remove(); 1722 1891 } 1723 1892 -
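The new linkPolymorphicCall() above makes three policy decisions before emitting any code: despecify to closure calls if any variant is a closure call, fall back to a virtual call if any non-host case would need an arity check or the call site is a varargs call, and fall back to a virtual call once the variant list exceeds a tier-dependent limit. The following is a standalone model of just that policy, not JSC code; the limits 15 and 5 are the Options.h defaults added in this patch:

    #include <cstddef>
    #include <vector>

    struct Variant {
        bool isClosureCall; // several functions share one executable
        bool canBeHandled;  // e.g. arity is satisfied and the site is not varargs
    };

    enum class Strategy { SwitchOnFunction, SwitchOnExecutable, VirtualCall };

    Strategy choosePolymorphicStrategy(const std::vector<Variant>& variants, bool callerIsTopTier)
    {
        const std::size_t limit = callerIsTopTier ? 5 : 15; // maxPolymorphicCallVariantListSize defaults
        if (variants.size() > limit)
            return Strategy::VirtualCall;
        bool anyClosure = false;
        for (const Variant& v : variants) {
            if (!v.canBeHandled)
                return Strategy::VirtualCall;
            if (v.isClosureCall)
                anyClosure = true;
        }
        // Despecifying to the executable lets every function that shares it hit the same case.
        return anyClosure ? Strategy::SwitchOnExecutable : Strategy::SwitchOnFunction;
    }

Note also that the per-case fastCounts array is only allocated when the caller is not already top-tier code, so FTL callers do not pay for the profiling counts.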
trunk/Source/JavaScriptCore/jit/Repatch.h
r178441 r179357 1 1 /* 2 * Copyright (C) 2011 Apple Inc. All rights reserved.2 * Copyright (C) 2011, 2015 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 30 30 31 31 #include "CCallHelpers.h" 32 #include "CallVariant.h" 32 33 #include "JITOperations.h" 33 34 … … 42 43 void linkFor(ExecState*, CallLinkInfo&, CodeBlock*, JSFunction* callee, MacroAssemblerCodePtr, CodeSpecializationKind, RegisterPreservationMode); 43 44 void linkSlowFor(ExecState*, CallLinkInfo&, CodeSpecializationKind, RegisterPreservationMode); 44 void linkClosureCall(ExecState*, CallLinkInfo&, CodeBlock*, ExecutableBase*, MacroAssemblerCodePtr, RegisterPreservationMode); 45 void unlinkFor(RepatchBuffer&, CallLinkInfo&, CodeSpecializationKind, RegisterPreservationMode); 46 void linkVirtualFor(ExecState*, CallLinkInfo&, CodeSpecializationKind, RegisterPreservationMode); 47 void linkPolymorphicCall(ExecState*, CallLinkInfo&, CallVariant, RegisterPreservationMode); 45 48 void resetGetByID(RepatchBuffer&, StructureStubInfo&); 46 49 void resetPutByID(RepatchBuffer&, StructureStubInfo&); -
trunk/Source/JavaScriptCore/jit/ThunkGenerators.cpp
r178856 r179357 138 138 } 139 139 140 static MacroAssemblerCodeRef link ClosureCallForThunkGenerator(140 static MacroAssemblerCodeRef linkPolymorphicCallForThunkGenerator( 141 141 VM* vm, RegisterPreservationMode registers) 142 142 { 143 143 CCallHelpers jit(vm); 144 144 145 slowPathFor(jit, vm, operationLink ClosureCallFor(registers));145 slowPathFor(jit, vm, operationLinkPolymorphicCallFor(registers)); 146 146 147 147 LinkBuffer patchBuffer(*vm, jit, GLOBAL_THUNK_ID); 148 return FINALIZE_CODE(patchBuffer, ("Link closurecall %s slow path thunk", registers == MustPreserveRegisters ? " that preserves registers" : ""));148 return FINALIZE_CODE(patchBuffer, ("Link polymorphic call %s slow path thunk", registers == MustPreserveRegisters ? " that preserves registers" : "")); 149 149 } 150 150 151 151 // For closure optimizations, we only include calls, since if you're using closures for 152 152 // object construction then you're going to lose big time anyway. 153 MacroAssemblerCodeRef link ClosureCallThunkGenerator(VM* vm)154 { 155 return link ClosureCallForThunkGenerator(vm, RegisterPreservationNotRequired);156 } 157 158 MacroAssemblerCodeRef link ClosureCallThatPreservesRegsThunkGenerator(VM* vm)159 { 160 return link ClosureCallForThunkGenerator(vm, MustPreserveRegisters);153 MacroAssemblerCodeRef linkPolymorphicCallThunkGenerator(VM* vm) 154 { 155 return linkPolymorphicCallForThunkGenerator(vm, RegisterPreservationNotRequired); 156 } 157 158 MacroAssemblerCodeRef linkPolymorphicCallThatPreservesRegsThunkGenerator(VM* vm) 159 { 160 return linkPolymorphicCallForThunkGenerator(vm, MustPreserveRegisters); 161 161 } 162 162 -
trunk/Source/JavaScriptCore/jit/ThunkGenerators.h
r171362 r179357 66 66 } 67 67 68 MacroAssemblerCodeRef link ClosureCallThunkGenerator(VM*);69 MacroAssemblerCodeRef link ClosureCallThatPreservesRegsThunkGenerator(VM*);68 MacroAssemblerCodeRef linkPolymorphicCallThunkGenerator(VM*); 69 MacroAssemblerCodeRef linkPolymorphicCallThatPreservesRegsThunkGenerator(VM*); 70 70 71 inline ThunkGenerator link ClosureCallThunkGeneratorFor(RegisterPreservationMode registers)71 inline ThunkGenerator linkPolymorphicCallThunkGeneratorFor(RegisterPreservationMode registers) 72 72 { 73 73 switch (registers) { 74 74 case RegisterPreservationNotRequired: 75 return link ClosureCallThunkGenerator;75 return linkPolymorphicCallThunkGenerator; 76 76 case MustPreserveRegisters: 77 return link ClosureCallThatPreservesRegsThunkGenerator;77 return linkPolymorphicCallThatPreservesRegsThunkGenerator; 78 78 } 79 79 RELEASE_ASSERT_NOT_REACHED(); -
trunk/Source/JavaScriptCore/llint/LLIntSlowPaths.cpp
r178928 r179357 333 333 } 334 334 default: 335 dataLog("Unexpected code block in LLInt: ", *codeBlock, "\n"); 335 336 RELEASE_ASSERT_NOT_REACHED(); 336 337 return false; -
trunk/Source/JavaScriptCore/runtime/Options.h
r177029 r179357 171 171 v(bool, enablePolymorphicAccessInlining, true) \ 172 172 v(bool, enablePolymorphicCallInlining, true) \ 173 v(bool, callStatusShouldUseCallEdgeProfile, true) \ 174 v(bool, callEdgeProfileReallyProcessesLog, true) \ 175 v(bool, baselineDoesCallEdgeProfiling, false) \ 176 v(bool, dfgDoesCallEdgeProfiling, true) \ 177 v(bool, enableCallEdgeProfiling, true) \ 173 v(unsigned, maxPolymorphicCallVariantListSize, 15) \ 174 v(unsigned, maxPolymorphicCallVariantListSizeForTopTier, 5) \ 175 v(unsigned, maxPolymorphicCallVariantsForInlining, 5) \ 178 176 v(unsigned, frequentCallThreshold, 2) \ 177 v(double, minimumCallToKnownRate, 0.51) \ 179 178 v(bool, optimizeNativeCalls, false) \ 180 179 v(bool, enableObjectAllocationSinking, true) \ -
trunk/Source/JavaScriptCore/runtime/VM.cpp
r177222 r179357 370 370 } 371 371 372 CallEdgeLog& VM::ensureCallEdgeLog()373 {374 if (!callEdgeLog)375 callEdgeLog = std::make_unique<CallEdgeLog>();376 return *callEdgeLog;377 }378 379 372 #if ENABLE(JIT) 380 373 static ThunkGenerator thunkGeneratorForIntrinsic(Intrinsic intrinsic) … … 460 453 void VM::prepareToDiscardCode() 461 454 { 462 if (callEdgeLog)463 callEdgeLog->processLog();464 465 455 #if ENABLE(DFG_JIT) 466 456 for (unsigned i = DFG::numberOfWorklists(); i--;) { -
trunk/Source/JavaScriptCore/runtime/VM.h
r177222 r179357 74 74 class ArityCheckFailReturnThunks; 75 75 class BuiltinExecutables; 76 class CallEdgeLog;77 76 class CodeBlock; 78 77 class CodeCache; … … 238 237 std::unique_ptr<DFG::LongLivedState> dfgState; 239 238 #endif // ENABLE(DFG_JIT) 240 241 std::unique_ptr<CallEdgeLog> callEdgeLog;242 CallEdgeLog& ensureCallEdgeLog();243 239 244 240 VMType vmType;