source: webkit/trunk/JavaScriptCore/VM/CodeBlock.h@ 37891

Last change on this file since 37891 was 37891, checked in by [email protected], 17 years ago

2008-10-25 Geoffrey Garen <[email protected]>

Reviewed by Sam Weinig, with Gavin Barraclough's help.


Fixed Sampling Tool:

  • Made CodeBlock sampling work with CTI
  • Improved accuracy by unifying most sampling data into a single 32-bit word, which can be written / read atomically.
  • Split out three different #ifdefs for modularity: OPCODE_SAMPLING; CODEBLOCK_SAMPLING; OPCODE_STATS.
  • Improved reporting clarity
  • Refactored for code clarity
  • VM/CTI.cpp: (JSC::CTI::emitCTICall): (JSC::CTI::compileOpCall): (JSC::CTI::emitSlowScriptCheck): (JSC::CTI::compileBinaryArithOpSlowCase): (JSC::CTI::privateCompileMainPass): (JSC::CTI::privateCompileSlowCases): (JSC::CTI::privateCompile):
  • VM/CTI.h: Updated CTI codegen to use the unified SamplingTool interface for encoding samples. (This required passing the current vPC to a lot more functions, since the unified interface samples the current vPC.) Added hooks for writing the current CodeBlock* on function entry and after a function call, for the sake of the CodeBlock sampler. Removed obsolete hook for clearing the current sample inside op_end. Also removed the custom enum used to differentiate flavors of op_call, since the OpcodeID enum works just as well. (This was important in an earlier version of the patch, but now it's just cleanup.)
  • VM/CodeBlock.cpp: (JSC::CodeBlock::lineNumberForVPC):
  • VM/CodeBlock.h: Updated for refactored #ifdefs. Changed lineNumberForVPC to be robust against vPCs not recorded for exception handling, since the Sampler may ask for an arbitrary vPC.
  • VM/Machine.cpp: (JSC::Machine::execute): (JSC::Machine::privateExecute): (JSC::Machine::cti_op_call_NotJSFunction): (JSC::Machine::cti_op_construct_NotJSConstruct):
  • VM/Machine.h: (JSC::Machine::setSampler): (JSC::Machine::sampler): (JSC::Machine::jitCodeBuffer): Updated for refactored #ifdefs. Changed Machine to use SamplingTool helper objects to record movement in and out of host code. This makes samples a bit more precise.


  • VM/Opcode.cpp: (JSC::OpcodeStats::~OpcodeStats):
  • VM/Opcode.h: Updated for refactored #ifdefs. Added a little more padding to accommodate our more verbose opcode names.
  • VM/SamplingTool.cpp: (JSC::ScopeSampleRecord::sample): Only count a sample toward our total if we actually record it. This solves cases where a CodeBlock will claim to have been sampled many times, with reported samples that don't match.

(JSC::SamplingTool::run): Read the current sample into a Sample helper
object, to ensure that the data doesn't change while we're analyzing it,
and to help decode the data. Only access the CodeBlock sampling hash
table if CodeBlock sampling has been enabled, so non-CodeBlock sampling
runs can operate with even less overhead.

(JSC::SamplingTool::dump): I reorganized this code a lot to print the
most important info at the top, print as a table, annotate and document
the stuff I didn't understand when I started, etc.

  • VM/SamplingTool.h: New helper classes, described above.
  • kjs/Parser.h:
  • kjs/Shell.cpp: (runWithScripts):
  • kjs/nodes.cpp: (JSC::ScopeNode::ScopeNode): Updated for new sampling APIs.
  • wtf/Platform.h: Moved sampling #defines here, since our custom is to put ENABLE #defines into Platform.h. Made explicit the fact that CODEBLOCK_SAMPLING depends on OPCODE_SAMPLING.
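The "single 32-bit word" point above is the heart of the accuracy fix: one aligned 32-bit store can be published by the VM thread and read by the sampling thread as a consistent snapshot, without locks. The actual bit layout is not shown on this page, so the sketch below uses a purely hypothetical encoding (a low flag bit plus a vPC offset) to illustrate the idea; `encodeSample` and friends are illustrative names, not WebKit APIs.

```cpp
#include <cstdint>

// Hypothetical encoding, NOT the real WebKit layout: the low bit marks
// "currently in host code", the remaining 31 bits carry a vPC offset.
// Because the whole sample fits in one aligned 32-bit word, a single
// store publishes it atomically on common architectures.
inline uint32_t encodeSample(uint32_t vPCOffset, bool inHostCode)
{
    return (vPCOffset << 1) | (inHostCode ? 1u : 0u);
}

inline uint32_t sampleVPCOffset(uint32_t sample) { return sample >> 1; }
inline bool sampleInHostCode(uint32_t sample) { return (sample & 1u) != 0; }
```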
/*
 * Copyright (C) 2008 Apple Inc. All rights reserved.
 * Copyright (C) 2008 Cameron Zwarich <[email protected]>
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 *
 * 1. Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 * 3. Neither the name of Apple Computer, Inc. ("Apple") nor the names of
 *    its contributors may be used to endorse or promote products derived
 *    from this software without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY APPLE AND ITS CONTRIBUTORS "AS IS" AND ANY
 * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
 * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
 * DISCLAIMED. IN NO EVENT SHALL APPLE OR ITS CONTRIBUTORS BE LIABLE FOR ANY
 * DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
 * (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
 * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
 * ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
 * THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */

#ifndef CodeBlock_h
#define CodeBlock_h

#include "Instruction.h"
#include "JSGlobalObject.h"
#include "nodes.h"
#include "Parser.h"
#include "SourceRange.h"
#include "ustring.h"
#include <wtf/RefPtr.h>
#include <wtf/Vector.h>

namespace JSC {

    class ExecState;

    enum CodeType { GlobalCode, EvalCode, FunctionCode };

    static ALWAYS_INLINE int missingThisObjectMarker() { return std::numeric_limits<int>::max(); }

    struct HandlerInfo {
        uint32_t start;
        uint32_t end;
        uint32_t target;
        uint32_t scopeDepth;
        void* nativeCode;
    };

    struct ExpressionRangeInfo {
        enum {
            MaxOffset = (1 << 7) - 1,
            MaxDivot = (1 << 25) - 1
        };
        uint32_t instructionOffset : 25;
        uint32_t divotPoint : 25;
        uint32_t startOffset : 7;
        uint32_t endOffset : 7;
    };

    struct LineInfo {
        uint32_t instructionOffset;
        int32_t lineNumber;
    };

    struct OffsetLocation {
        int32_t branchOffset;
#if ENABLE(CTI)
        void* ctiOffset;
#endif
    };

    struct StructureStubInfo {
        StructureStubInfo(unsigned opcodeIndex)
            : opcodeIndex(opcodeIndex)
            , stubRoutine(0)
            , callReturnLocation(0)
            , hotPathBegin(0)
        {
        }

        unsigned opcodeIndex;
        void* stubRoutine;
        void* callReturnLocation;
        void* hotPathBegin;
    };

    struct CallLinkInfo {
        CallLinkInfo()
            : callReturnLocation(0)
            , hotPathBegin(0)
            , hotPathOther(0)
            , coldPathOther(0)
            , callee(0)
        {
        }

        unsigned opcodeIndex;
        void* callReturnLocation;
        void* hotPathBegin;
        void* hotPathOther;
        void* coldPathOther;
        CodeBlock* callee;
        unsigned position;

        void setUnlinked() { callee = 0; }
        bool isLinked() { return callee; }
    };

    inline void* getStructureStubInfoReturnLocation(StructureStubInfo* structureStubInfo)
    {
        return structureStubInfo->callReturnLocation;
    }

    // Binary chop algorithm: calls valueAtPosition on pre-sorted elements in the array
    // and compares the result with key (KeyType must be comparable with '==' and '<').
    // Optimized for cases where the array contains the key, checked by assertions.
    template<typename ArrayType, typename KeyType, KeyType(*valueAtPosition)(ArrayType*)>
    inline ArrayType* binaryChop(ArrayType* array, size_t size, KeyType key)
    {
        // The array must contain at least one element (precondition: the array does contain key).
        // If the array only contains one element, no need to do the comparison.
        while (size > 1) {
            // Pick an element to check, halfway through the array, and read its value.
            int pos = (size - 1) >> 1;
            KeyType val = valueAtPosition(&array[pos]);

            // If the key matches, success!
            if (val == key)
                return &array[pos];
            // The item we are looking for is smaller than the item being checked; reduce 'size',
            // chopping off the right-hand half of the array.
            else if (key < val)
                size = pos;
            // Discard all values in the left-hand half of the array, up to and including the item at pos.
            else {
                size -= (pos + 1);
                array += (pos + 1);
            }

            // 'size' should never reach zero.
            ASSERT(size);
        }

        // If we reach this point we've chopped down to one element; no need to check that it matches.
        ASSERT(size == 1);
        ASSERT(key == valueAtPosition(&array[0]));
        return &array[0];
    }
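To make the chop's contract concrete, here is a small self-contained usage sketch. `Entry`, `entryKey`, and `chop` are illustrative stand-ins (the real caller is `CodeBlock::getStubInfo` below, which chops `propertyAccessInstructions` by return address); the template body mirrors the one above with the WebKit-specific ASSERTs dropped so it compiles on its own.

```cpp
#include <cstddef>

// Illustrative element type: a sorted record keyed by a code address.
struct Entry {
    void* returnAddress;
    int payload;
};

void* entryKey(Entry* e) { return e->returnAddress; }

// Same shape as the binaryChop above, minus the assertions.
template<typename ArrayType, typename KeyType, KeyType (*valueAtPosition)(ArrayType*)>
ArrayType* chop(ArrayType* array, size_t size, KeyType key)
{
    while (size > 1) {
        size_t pos = (size - 1) >> 1;
        KeyType val = valueAtPosition(&array[pos]);
        if (val == key)
            return &array[pos];
        if (key < val)
            size = pos;            // discard the right-hand half
        else {
            size -= (pos + 1);     // discard the left half, including pos
            array += (pos + 1);
        }
    }
    return &array[0];              // precondition: this is the match
}
```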

    struct StringJumpTable {
        typedef HashMap<RefPtr<UString::Rep>, OffsetLocation> StringOffsetTable;
        StringOffsetTable offsetTable;
#if ENABLE(CTI)
        void* ctiDefault; // FIXME: it should not be necessary to store this.
#endif

        inline int32_t offsetForValue(UString::Rep* value, int32_t defaultOffset)
        {
            StringOffsetTable::const_iterator end = offsetTable.end();
            StringOffsetTable::const_iterator loc = offsetTable.find(value);
            if (loc == end)
                return defaultOffset;
            return loc->second.branchOffset;
        }

#if ENABLE(CTI)
        inline void* ctiForValue(UString::Rep* value)
        {
            StringOffsetTable::const_iterator end = offsetTable.end();
            StringOffsetTable::const_iterator loc = offsetTable.find(value);
            if (loc == end)
                return ctiDefault;
            return loc->second.ctiOffset;
        }
#endif
    };

    struct SimpleJumpTable {
        // FIXME: The two Vectors can be combined into one Vector<OffsetLocation>.
        Vector<int32_t> branchOffsets;
        int32_t min;
#if ENABLE(CTI)
        Vector<void*> ctiOffsets;
        void* ctiDefault;
#endif

        int32_t offsetForValue(int32_t value, int32_t defaultOffset);
        void add(int32_t key, int32_t offset)
        {
            if (!branchOffsets[key])
                branchOffsets[key] = offset;
        }

#if ENABLE(CTI)
        inline void* ctiForValue(int32_t value)
        {
            if (value >= min && static_cast<uint32_t>(value - min) < ctiOffsets.size())
                return ctiOffsets[value - min];
            return ctiDefault;
        }
#endif
    };
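`SimpleJumpTable::offsetForValue` is only declared here; its body lives in CodeBlock.cpp. Judging from `ctiForValue`'s bounds check and the zero-means-unset convention visible in `add()`, its behavior can be modeled as the standalone sketch below. This is an inference, not the actual implementation, and `std::vector` stands in for `WTF::Vector`.

```cpp
#include <cstdint>
#include <vector>

// Dense switch-table model: case targets are stored at index (value - min);
// anything out of range, or a slot still holding 0, falls back to default.
struct DenseJumpTable {
    std::vector<int32_t> branchOffsets;
    int32_t min;

    int32_t offsetForValue(int32_t value, int32_t defaultOffset) const
    {
        if (value >= min && static_cast<uint32_t>(value - min) < branchOffsets.size()) {
            if (int32_t offset = branchOffsets[value - min])
                return offset;
        }
        return defaultOffset;
    }
};
```

The same shape explains why immediate and character switches use this table: their case values cluster in a small numeric range, so an array indexed by `value - min` beats a hash lookup.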

    class EvalCodeCache {
    public:
        PassRefPtr<EvalNode> get(ExecState* exec, const UString& evalSource, ScopeChainNode* scopeChain, JSValue*& exceptionValue)
        {
            RefPtr<EvalNode> evalNode;

            if (evalSource.size() < maxCacheableSourceLength && (*scopeChain->begin())->isVariableObject())
                evalNode = cacheMap.get(evalSource.rep());

            if (!evalNode) {
                int errLine;
                UString errMsg;

                SourceCode source = makeSource(evalSource);
                evalNode = exec->globalData().parser->parse<EvalNode>(exec, exec->dynamicGlobalObject()->debugger(), source, &errLine, &errMsg);
                if (evalNode) {
                    if (evalSource.size() < maxCacheableSourceLength && (*scopeChain->begin())->isVariableObject() && cacheMap.size() < maxCacheEntries)
                        cacheMap.set(evalSource.rep(), evalNode);
                } else {
                    exceptionValue = Error::create(exec, SyntaxError, errMsg, errLine, source.provider()->asID(), NULL);
                    return 0;
                }
            }

            return evalNode.release();
        }

    private:
        static const int maxCacheableSourceLength = 256;
        static const int maxCacheEntries = 64;

        HashMap<RefPtr<UString::Rep>, RefPtr<EvalNode> > cacheMap;
    };
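The caching policy above, which only caches sources under 256 characters, stops inserting at 64 entries, and reparses on every miss, can be shown in miniature. This standalone sketch substitutes `std::map` for `WTF::HashMap` and a trivial string transform for the parser; `BoundedEvalCache` and its method names are illustrative, not WebKit APIs.

```cpp
#include <cstddef>
#include <map>
#include <string>

// Miniature model of the EvalCodeCache policy: small sources are memoized,
// the cache stops growing at a fixed entry cap, and oversized sources are
// recompiled every time. "compiled:" + source stands in for a parse tree.
class BoundedEvalCache {
public:
    std::string get(const std::string& source)
    {
        bool cacheable = source.size() < maxCacheableSourceLength;
        if (cacheable) {
            std::map<std::string, std::string>::const_iterator it = m_cache.find(source);
            if (it != m_cache.end())
                return it->second;
        }
        std::string compiled = "compiled:" + source; // stand-in for parsing
        if (cacheable && m_cache.size() < maxCacheEntries)
            m_cache[source] = compiled;
        return compiled;
    }

    size_t size() const { return m_cache.size(); }

private:
    static const size_t maxCacheableSourceLength = 256;
    static const size_t maxCacheEntries = 64;
    std::map<std::string, std::string> m_cache;
};
```

Capping both the source length and the entry count bounds the memory the cache can pin, which matters because cached parse trees keep their whole subtree alive.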

    struct CodeBlock {
        CodeBlock(ScopeNode* ownerNode_, CodeType codeType_, PassRefPtr<SourceProvider> source_, unsigned sourceOffset_)
            : ownerNode(ownerNode_)
            , globalData(0)
#if ENABLE(CTI)
            , ctiCode(0)
#endif
            , numCalleeRegisters(0)
            , numConstants(0)
            , numVars(0)
            , numParameters(0)
            , needsFullScopeChain(ownerNode_->needsActivation())
            , usesEval(ownerNode_->usesEval())
            , codeType(codeType_)
            , source(source_)
            , sourceOffset(sourceOffset_)
        {
            ASSERT(source);
        }

        ~CodeBlock();

#if ENABLE(CTI)
        void unlinkCallers();
#endif

        void addCaller(CallLinkInfo* caller)
        {
            caller->callee = this;
            caller->position = linkedCallerList.size();
            linkedCallerList.append(caller);
        }

        void removeCaller(CallLinkInfo* caller)
        {
            unsigned pos = caller->position;
            unsigned lastPos = linkedCallerList.size() - 1;

            if (pos != lastPos) {
                linkedCallerList[pos] = linkedCallerList[lastPos];
                linkedCallerList[pos]->position = pos;
            }
            linkedCallerList.shrink(lastPos);
        }

#if !defined(NDEBUG) || ENABLE(OPCODE_SAMPLING)
        void dump(ExecState*) const;
        void printStructureIDs(const Instruction*) const;
        void printStructureID(const char* name, const Instruction*, int operand) const;
#endif
        int expressionRangeForVPC(const Instruction*, int& divot, int& startOffset, int& endOffset);
        int lineNumberForVPC(const Instruction* vPC);
        bool getHandlerForVPC(const Instruction* vPC, Instruction*& target, int& scopeDepth);
        void* nativeExceptionCodeForHandlerVPC(const Instruction* handlerVPC);

        void mark();
        void refStructureIDs(Instruction* vPC) const;
        void derefStructureIDs(Instruction* vPC) const;

        StructureStubInfo& getStubInfo(void* returnAddress)
        {
            return *(binaryChop<StructureStubInfo, void*, getStructureStubInfoReturnLocation>(propertyAccessInstructions.begin(), propertyAccessInstructions.size(), returnAddress));
        }

        ScopeNode* ownerNode;
        JSGlobalData* globalData;
#if ENABLE(CTI)
        void* ctiCode;
#endif

        int numCalleeRegisters;

        // NOTE: numConstants holds the number of constant registers allocated
        // by the code generator, not the number of constant registers used.
        // (Duplicate constants are uniqued during code generation, and spare
        // constant registers may be allocated.)
        int numConstants;
        int numVars;
        int numParameters;
        int thisRegister;
        bool needsFullScopeChain;
        bool usesEval;
        bool usesArguments;
        CodeType codeType;
        RefPtr<SourceProvider> source;
        unsigned sourceOffset;

        Vector<Instruction> instructions;
        Vector<unsigned> globalResolveInstructions;
        Vector<StructureStubInfo> propertyAccessInstructions;
        Vector<CallLinkInfo> callLinkInfos;
        Vector<CallLinkInfo*> linkedCallerList;

        // Constant pool
        Vector<Identifier> identifiers;
        Vector<RefPtr<FuncDeclNode> > functions;
        Vector<RefPtr<FuncExprNode> > functionExpressions;
        Vector<Register> constantRegisters;
        Vector<JSValue*> unexpectedConstants;
        Vector<RefPtr<RegExp> > regexps;
        Vector<HandlerInfo> exceptionHandlers;
        Vector<ExpressionRangeInfo> expressionInfo;
        Vector<LineInfo> lineInfo;

        Vector<SimpleJumpTable> immediateSwitchJumpTables;
        Vector<SimpleJumpTable> characterSwitchJumpTables;
        Vector<StringJumpTable> stringSwitchJumpTables;

        HashSet<unsigned, DefaultHash<unsigned>::Hash, WTF::UnsignedWithZeroKeyHashTraits<unsigned> > labels;

#if ENABLE(CTI)
        HashMap<void*, unsigned> ctiReturnAddressVPCMap;
#endif

        EvalCodeCache evalCodeCache;

    private:
#if !defined(NDEBUG) || ENABLE(OPCODE_SAMPLING)
        void dump(ExecState*, const Vector<Instruction>::const_iterator& begin, Vector<Instruction>::const_iterator&) const;
#endif
    };

    // Program code is not marked by any function, so we make the global object
    // responsible for marking it.

    struct ProgramCodeBlock : public CodeBlock {
        ProgramCodeBlock(ScopeNode* ownerNode_, CodeType codeType_, JSGlobalObject* globalObject_, PassRefPtr<SourceProvider> source_)
            : CodeBlock(ownerNode_, codeType_, source_, 0)
            , globalObject(globalObject_)
        {
            globalObject->codeBlocks().add(this);
        }

        ~ProgramCodeBlock()
        {
            if (globalObject)
                globalObject->codeBlocks().remove(this);
        }

        JSGlobalObject* globalObject; // For program and eval nodes, the global object that marks the constant pool.
    };

    struct EvalCodeBlock : public ProgramCodeBlock {
        EvalCodeBlock(ScopeNode* ownerNode_, JSGlobalObject* globalObject_, PassRefPtr<SourceProvider> source_)
            : ProgramCodeBlock(ownerNode_, EvalCode, globalObject_, source_)
        {
        }
    };

} // namespace JSC

#endif // CodeBlock_h