Hi all, thank you for your great work.
I have a feature request: it would be great to implement an interactive mode (like the -i option available in llama.cpp) so that the starchat-alpha fine-tuned version of the model can be run conversationally. I have tested it with the prompt option and it works properly, but it would be much more useful with an interactive mode.
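For reference, here is a rough sketch of the kind of loop I have in mind, just to illustrate the request. The names (generate_reply) and the chat tags are only placeholders and assumptions on my part, not the project's actual API; the real version would wrap the same inference path the prompt option already uses.

```cpp
// Minimal sketch of an interactive loop around the existing prompt-based
// generation. generate_reply() is a hypothetical stand-in for the model call.
#include <iostream>
#include <string>

// Hypothetical: in the real implementation this would run starchat-alpha on
// full_prompt and return only the newly generated text.
std::string generate_reply(const std::string &full_prompt) {
    return "(model output for: " + full_prompt + ")";
}

int main() {
    std::string history;     // accumulated conversation so far
    std::string user_input;

    std::cout << "Interactive mode (empty line to exit)\n";
    while (true) {
        std::cout << "> " << std::flush;
        if (!std::getline(std::cin, user_input) || user_input.empty())
            break;

        // Assumed chat template; the exact tags would need to match the
        // format starchat-alpha was trained with.
        history += "<|user|>\n" + user_input + "\n<|assistant|>\n";

        const std::string reply = generate_reply(history);
        std::cout << reply << "\n";

        history += reply + "\n";
    }
    return 0;
}
```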
Thanks in advance!
Best,
Jordi