
Bug: --chat-template seems to be broken now, no way to truly chat from the llama-cli #8053

Closed
@Deputation

Description


What happened?

As per discussions:

#7837
#8009

It seems to be impossible to chat properly with Llama 3 8B. I have not tested this on the 70B models, but even in the server UI the model just starts making notes to itself and outputs garbage / training data about how it should converse instead of actually conversing. Has something happened to the --chat-template chatml parameter? Even when the CLI is set to print special tokens, I do not see the ChatML tokens in the output.
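
For reference, here is a minimal sketch of the kind of invocation I mean (the model path and quantization are placeholders, not my exact setup):

```sh
# Hypothetical repro command; model path/quant are placeholders.
# --chat-template chatml forces the ChatML template, -cnv enables
# conversation mode, and --special asks the CLI to print special tokens.
./llama-cli -m models/Meta-Llama-3-8B-Instruct.Q4_K_M.gguf \
    --chat-template chatml -cnv --special \
    -p "You are a helpful assistant."
```

With --special enabled I would expect each turn to be wrapped in ChatML markers, roughly like:

```
<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
Hello<|im_end|>
<|im_start|>assistant
```

but none of these tokens appear in the output.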

Name and Version

version: 3158 (5239925)

What operating system are you seeing the problem on?

Linux

Relevant log output

No response

Metadata


Labels

bug-unconfirmed, low severity
