Timestamp: Apr 22, 2016, 7:00:38 PM
Author: [email protected]
Message:

Speed up bound functions a bit
https://bugs.webkit.org/show_bug.cgi?id=156889

Reviewed by Saam Barati.
Source/JavaScriptCore:


Bound functions are hard to optimize because JSC doesn't have a good notion of non-JS code
that does JS-ey things like make JS calls. What I mean by "non-JS code" is code that did not
originate from JS source. A bound function does a highly polymorphic call to the target
stored in the JSBoundFunction. Prior to this change, we represented it as native code that
used the generic native->JS call API. That's not cheap.
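
To make "that's not cheap" concrete, here is a minimal sketch of the work the generic native
path does on every call. The function name matches the JSBoundFunction.cpp entries listed
below, but the body is illustrative rather than the exact shipped code:

    // Illustrative sketch of a generic bound-function host call: every call
    // rebuilds an argument buffer (bound args followed by incoming args) and
    // then funnels through the generic native->JS call machinery
    // (getCallData + call), which is the expensive part.
    EncodedJSValue JSC_HOST_CALL boundFunctionCall(ExecState* exec)
    {
        JSBoundFunction* boundFunction = jsCast<JSBoundFunction*>(exec->callee());

        MarkedArgumentBuffer args;
        if (JSArray* boundArgs = boundFunction->boundArgs()) {
            for (unsigned i = 0; i < boundArgs->length(); ++i)
                args.append(boundArgs->getIndexQuickly(i));
        }
        for (unsigned i = 0; i < exec->argumentCount(); ++i)
            args.append(exec->uncheckedArgument(i));

        JSObject* targetFunction = boundFunction->targetFunction();
        CallData callData;
        CallType callType = getCallData(targetFunction, callData);
        return JSValue::encode(
            call(exec, targetFunction, callType, callData, boundFunction->boundThis(), args));
    }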

We could model bound functions using a builtin, but it's not clear that this would be easy
to grok, since so much of the code would have to access special parts of the JSBoundFunction
type. Doing it that way might solve the performance problems but it would mean extra work to
arrange for the builtin to have speedy access to the call target, the bound this, and the
bound arguments. Also, optimizing bound functions that way would mean that bound function
performance would be gated on the performance of a bunch of other things in our system. For
example, we'd want this polymorphic call to be handled like the funnel that it is: if we're
compiling the bound function's outgoing call with no context then we should compile it as
fully polymorphic but we can let it assume basic sanity like that the callee is a real
function; but if we're compiling the call with any amount of calling context then we want to
use normal call IC's.

Since the builtin path wouldn't lead to a simpler patch and since I think that the VM will
benefit in the long run from using custom handling for bound functions, I kept the native
code and just added Intrinsic/thunk support.

This just adds an Intrinsic for bound function calls where the JSBoundFunction targets a
JSFunction instance and has no bound arguments (only bound this). This intrinsic is
currently only implemented as a thunk and not yet recognized by the DFG bytecode parser.
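
Roughly, eligibility for the fast path is decided when the bound function's executable is
chosen. The sketch below shows the assumed shape of that selection; the helper name, the
intrinsic's enum name, and the exact getHostFunction overload are placeholders here, while
boundThisNoArgsFunctionCall/Construct and boundFunctionCall/Construct come from the function
list below:

    // Sketch (assumed wiring): attach the specialized call/construct pair and
    // the new Intrinsic only when the target is a JSFunction and there are no
    // bound arguments (only a bound |this|); otherwise keep the generic pair
    // with NoIntrinsic. Helper and enum names are illustrative.
    static NativeExecutable* selectBoundFunctionExecutable(
        VM& vm, JSObject* targetFunction, JSArray* boundArgs, const String& name)
    {
        bool onlyBoundThis = targetFunction->inherits(JSFunction::info())
            && (!boundArgs || !boundArgs->length());
        if (onlyBoundThis) {
            return vm.getHostFunction(
                boundThisNoArgsFunctionCall, BoundThisNoArgsFunctionCallIntrinsic,
                boundThisNoArgsFunctionConstruct, name);
        }
        return vm.getHostFunction(boundFunctionCall, NoIntrinsic, boundFunctionConstruct, name);
    }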

I needed to loosen some restrictions to do this. For one, I was really tired of our bad use
of ENABLE(JIT) conditionals, which made it so that any serious client of Intrinsics would
have to have #ifdefs. Really what should happen is that if the JIT is not enabled then we
just ignore intrinsics. Also, the code previously assumed that having a native constructor
and knowing the Intrinsic for your native call were mutually exclusive. This change makes it
possible for a native executable to have a custom call function, a custom constructor, and an
Intrinsic.
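
The wiring that makes this work without #ifdefs is VM's intrinsic-to-thunk mapping
(thunkGeneratorForIntrinsic and getHostFunction both appear in the function list below). A
hedged sketch of the idea, with the intrinsic's enum name assumed:

    // Sketch (assumed shape): map an Intrinsic to its thunk generator. A caller
    // that gets nullptr back, or a build without the JIT, simply uses the plain
    // native call path, so clients of Intrinsics no longer need ENABLE(JIT)
    // conditionals around every use.
    static ThunkGenerator thunkGeneratorForIntrinsic(Intrinsic intrinsic)
    {
        switch (intrinsic) {
        case BoundThisNoArgsFunctionCallIntrinsic: // assumed enum name
            return boundThisNoArgsFunctionCallGenerator; // see ThunkGenerators.cpp
        // ... other intrinsics elided ...
        default:
            return nullptr; // no thunk: use the generic native call trampoline
        }
    }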

This is a >4x speed-up on bound function calls with no bound arguments.

In the future, we should teach the DFG Intrinsic handling to deal with bound functions and
we should teach the inliner (and ByteCodeParser::handleCall() in general) how to deal with
the function call inside the bound function. That would be super awesome.

  • assembler/AbstractMacroAssembler.h:

(JSC::AbstractMacroAssembler::timesPtr):
(JSC::AbstractMacroAssembler::Address::withOffset):
(JSC::AbstractMacroAssembler::BaseIndex::BaseIndex):
(JSC::AbstractMacroAssembler<AssemblerType, MacroAssemblerType>::Address::indexedBy):

  • jit/AssemblyHelpers.h:

(JSC::AssemblyHelpers::storeCell):
(JSC::AssemblyHelpers::loadCell):
(JSC::AssemblyHelpers::storeValue):
(JSC::AssemblyHelpers::emitSaveCalleeSaves):
(JSC::AssemblyHelpers::emitSaveThenMaterializeTagRegisters):
(JSC::AssemblyHelpers::emitRestoreCalleeSaves):
(JSC::AssemblyHelpers::emitRestoreSavedTagRegisters):
(JSC::AssemblyHelpers::copyCalleeSavesToVMCalleeSavesBuffer):

  • jit/JITThunks.cpp:

(JSC::JITThunks::ctiNativeTailCall):
(JSC::JITThunks::ctiNativeTailCallWithoutSavedTags):
(JSC::JITThunks::ctiStub):
(JSC::JITThunks::hostFunctionStub):
(JSC::JITThunks::clearHostFunctionStubs):

  • jit/JITThunks.h:
  • jit/SpecializedThunkJIT.h:

(JSC::SpecializedThunkJIT::callDoubleToDoublePreservingReturn):
(JSC::SpecializedThunkJIT::tagReturnAsInt32):
(JSC::SpecializedThunkJIT::emitSaveThenMaterializeTagRegisters): Deleted.
(JSC::SpecializedThunkJIT::emitRestoreSavedTagRegisters): Deleted.

  • jit/ThunkGenerators.cpp:

(JSC::virtualThunkFor):
(JSC::nativeForGenerator):
(JSC::nativeCallGenerator):
(JSC::nativeTailCallGenerator):
(JSC::nativeTailCallWithoutSavedTagsGenerator):
(JSC::nativeConstructGenerator):
(JSC::randomThunkGenerator):
(JSC::boundThisNoArgsFunctionCallGenerator):

  • jit/ThunkGenerators.h:
  • runtime/Executable.cpp:

(JSC::NativeExecutable::create):
(JSC::NativeExecutable::destroy):
(JSC::NativeExecutable::createStructure):
(JSC::NativeExecutable::finishCreation):
(JSC::NativeExecutable::NativeExecutable):
(JSC::ScriptExecutable::ScriptExecutable):

  • runtime/Executable.h:
  • runtime/FunctionPrototype.cpp:

(JSC::functionProtoFuncBind):

  • runtime/IntlCollatorPrototype.cpp:

(JSC::IntlCollatorPrototypeGetterCompare):

  • runtime/Intrinsic.h:
  • runtime/JSBoundFunction.cpp:

(JSC::boundThisNoArgsFunctionCall):
(JSC::boundFunctionCall):
(JSC::boundThisNoArgsFunctionConstruct):
(JSC::boundFunctionConstruct):
(JSC::getBoundFunctionStructure):
(JSC::JSBoundFunction::create):
(JSC::JSBoundFunction::customHasInstance):
(JSC::JSBoundFunction::JSBoundFunction):

  • runtime/JSBoundFunction.h:

(JSC::JSBoundFunction::targetFunction):
(JSC::JSBoundFunction::boundThis):
(JSC::JSBoundFunction::boundArgs):
(JSC::JSBoundFunction::createStructure):
(JSC::JSBoundFunction::offsetOfTargetFunction):
(JSC::JSBoundFunction::offsetOfBoundThis):

  • runtime/JSFunction.cpp:

(JSC::JSFunction::lookUpOrCreateNativeExecutable):
(JSC::JSFunction::create):

  • runtime/VM.cpp:

(JSC::thunkGeneratorForIntrinsic):
(JSC::VM::getHostFunction):

  • runtime/VM.h:

(JSC::VM::getCTIStub):
(JSC::VM::exceptionOffset):

LayoutTests:

This microbenchmark speeds up by >4x with this change.

  • js/regress/bound-function-call-expected.txt: Added.
  • js/regress/bound-function-call.html: Added.
  • js/regress/script-tests/bound-function-call.js: Added.

(foo):

File: 1 edited

  • trunk/Source/JavaScriptCore/jit/JITThunks.cpp

--- trunk/Source/JavaScriptCore/jit/JITThunks.cpp (r197815)
+++ trunk/Source/JavaScriptCore/jit/JITThunks.cpp (r199946)
@@ -1,4 +1,4 @@
 /*
- * Copyright (C) 2012, 2013, 2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2012, 2013, 2015-2016 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
@@ -65,4 +65,10 @@
 }
 
+MacroAssemblerCodePtr JITThunks::ctiNativeTailCallWithoutSavedTags(VM* vm)
+{
+    ASSERT(vm->canUseJIT());
+    return ctiStub(vm, nativeTailCallWithoutSavedTagsGenerator).code();
+}
+
 MacroAssemblerCodeRef JITThunks::ctiStub(VM* vm, ThunkGenerator generator)
 {
@@ -85,25 +91,13 @@
 NativeExecutable* JITThunks::hostFunctionStub(VM* vm, NativeFunction function, NativeFunction constructor, const String& name)
 {
-    ASSERT(!isCompilationThread());
-
-    if (NativeExecutable* nativeExecutable = m_hostFunctionStubMap->get(std::make_tuple(function, constructor, name)))
-        return nativeExecutable;
-
-    NativeExecutable* nativeExecutable = NativeExecutable::create(
-        *vm,
-        adoptRef(new NativeJITCode(JIT::compileCTINativeCall(vm, function), JITCode::HostCallThunk)),
-        function,
-        adoptRef(new NativeJITCode(MacroAssemblerCodeRef::createSelfManagedCodeRef(ctiNativeConstruct(vm)), JITCode::HostCallThunk)),
-        constructor, NoIntrinsic, name);
-    weakAdd(*m_hostFunctionStubMap, std::make_tuple(function, constructor, name), Weak<NativeExecutable>(nativeExecutable, this));
-    return nativeExecutable;
+    return hostFunctionStub(vm, function, constructor, nullptr, NoIntrinsic, name);
 }
 
-NativeExecutable* JITThunks::hostFunctionStub(VM* vm, NativeFunction function, ThunkGenerator generator, Intrinsic intrinsic, const String& name)
+NativeExecutable* JITThunks::hostFunctionStub(VM* vm, NativeFunction function, NativeFunction constructor, ThunkGenerator generator, Intrinsic intrinsic, const String& name)
 {
     ASSERT(!isCompilationThread());
     ASSERT(vm->canUseJIT());
 
-    if (NativeExecutable* nativeExecutable = m_hostFunctionStubMap->get(std::make_tuple(function, &callHostFunctionAsConstructor, name)))
+    if (NativeExecutable* nativeExecutable = m_hostFunctionStubMap->get(std::make_tuple(function, constructor, name)))
         return nativeExecutable;
 
@@ -117,7 +111,12 @@
     RefPtr<JITCode> forConstruct = adoptRef(new NativeJITCode(MacroAssemblerCodeRef::createSelfManagedCodeRef(ctiNativeConstruct(vm)), JITCode::HostCallThunk));
 
-    NativeExecutable* nativeExecutable = NativeExecutable::create(*vm, forCall, function, forConstruct, callHostFunctionAsConstructor, intrinsic, name);
-    weakAdd(*m_hostFunctionStubMap, std::make_tuple(function, &callHostFunctionAsConstructor, name), Weak<NativeExecutable>(nativeExecutable, this));
+    NativeExecutable* nativeExecutable = NativeExecutable::create(*vm, forCall, function, forConstruct, constructor, intrinsic, name);
+    weakAdd(*m_hostFunctionStubMap, std::make_tuple(function, constructor, name), Weak<NativeExecutable>(nativeExecutable, this));
     return nativeExecutable;
+}
+
+NativeExecutable* JITThunks::hostFunctionStub(VM* vm, NativeFunction function, ThunkGenerator generator, Intrinsic intrinsic, const String& name)
+{
+    return hostFunctionStub(vm, function, callHostFunctionAsConstructor, generator, intrinsic, name);
 }