A smart model made using the cleaned Orca data.

Prompt format:
{System Prompt}

Username: {Input}
BotName: {Response}
Username: {Input}
BotName: {Response}
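
As a quick illustration (not part of the original card), here is a minimal sketch of how you might assemble a prompt in this format. The function name, parameters, and example messages are hypothetical; "Username" and "BotName" are the placeholder role tags from the template and should be swapped for your actual names.

```python
def build_prompt(system_prompt, history, new_message,
                 user_name="Username", bot_name="BotName"):
    """Assemble a prompt in the card's format: system prompt, blank line,
    then alternating user/bot turns, ending with an open bot tag so the
    model generates the next reply.

    history: list of (user_msg, bot_msg) pairs already exchanged.
    new_message: the latest user turn, which has no reply yet.
    """
    lines = [system_prompt, ""]
    for user_msg, bot_msg in history:
        lines.append(f"{user_name}: {user_msg}")
        lines.append(f"{bot_name}: {bot_msg}")
    lines.append(f"{user_name}: {new_message}")
    lines.append(f"{bot_name}:")
    return "\n".join(lines)


if __name__ == "__main__":
    prompt = build_prompt(
        "You are a friendly tavern keeper.",
        [("Hello!", "Welcome in, traveler.")],
        "What's on the menu tonight?",
    )
    print(prompt)
```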

Seriously, I have to add more here due to HF Leaderboard requirements. So basically, this model uses a cleaned version of Orca along with my typical RP data package. It was intended as a test to see whether the model's RP evals would be affected by an overwhelming amount of instruct tokens.

Model size: 7B parameters (Safetensors, BF16)
