
Commit b70fd3a

server : use common_token_to_piece instead of common_detokenize
This commit replaces the call to common_detokenize with common_token_to_piece in populate_token_probs. The motivation for this change is to avoid an issue where common_detokenize would remove the word boundary character of a token, which caused a regression in the server-generated token probabilities.

Resolves: #11728
1 parent: d7b31a9

File tree

1 file changed, 1 insertion(+), 1 deletion(-)


examples/server/server.cpp

@@ -2297,7 +2297,7 @@ struct server_context {
         for (size_t i = 0; i < std::min(n_vocab, n_probs); i++) {
             result.probs.push_back({
                 cur[i].id,
-                common_detokenize(ctx, {cur[i].id}, special),
+                common_token_to_piece(ctx, cur[i].id, special),
                 cur[i].p
             });
         }
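For context, here is a minimal sketch of the behavioral difference the commit message describes. It is not part of the commit: it assumes a loaded llama.cpp model with a valid llama_context, uses the same two helpers from common.h with the same call shapes as in the diff above, and compare_single_token is a hypothetical illustration helper.

// Sketch only: illustrates why the swap matters for per-token probabilities.
// Assumes ctx is a valid llama_context * and tok is a token whose piece
// carries a leading word-boundary space (e.g. " world" in SPM-style vocabs).
#include "common.h"

#include <cstdio>
#include <string>
#include <vector>

static void compare_single_token(llama_context * ctx, llama_token tok, bool special) {
    // common_detokenize renders a token sequence as user-facing text, which
    // (per the commit message) can strip the token's word boundary character.
    const std::string via_detok = common_detokenize(ctx, {tok}, special);

    // common_token_to_piece returns the raw piece for exactly one token,
    // keeping the boundary character, which the probs output needs.
    const std::string via_piece = common_token_to_piece(ctx, tok, special);

    printf("detokenize:     '%s'\n", via_detok.c_str()); // e.g. 'world'
    printf("token_to_piece: '%s'\n", via_piece.c_str()); // e.g. ' world'
}

Since populate_token_probs emits one string per candidate token, the single-token helper is the better fit there: there is no surrounding sequence whose joined text needs cleanup, and the boundary marker is part of the information the client wants.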
