

I mean, they helpfully provide ranking lists. Start at #1 and work your way down
Actually, as to your edit, it sounds like you're fine-tuning the model on your data, not training it from scratch. So the LLM has already seen English and Chinese during the initial training. Also, LLMs represent words as vectors, and similar words' vectors usually end up close together. So substituting e.g. "Dad" for "Papa" looks almost the same to an LLM. Same across languages. But that's not understanding; that's behavior that much simpler models also have.
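To make the vector point concrete, here's a minimal sketch with made-up 4-dimensional embeddings (real models learn high-dimensional ones during training, so the numbers below are invented for illustration only): a near-synonym's vector is nearly parallel to the original word's, so cosine similarity between them stays high.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine of the angle between two vectors: 1.0 means "same direction".
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings, invented for this example; real ones come
# from a trained model and have hundreds of dimensions.
embeddings = {
    "dad":  np.array([0.90, 0.10, 0.30, 0.05]),
    "papa": np.array([0.88, 0.12, 0.28, 0.07]),  # near-synonym: almost parallel
    "car":  np.array([0.05, 0.95, 0.10, 0.60]),  # unrelated: points elsewhere
}

print(cosine_similarity(embeddings["dad"], embeddings["papa"]))  # ~0.999
print(cosine_similarity(embeddings["dad"], embeddings["car"]))   # ~0.19

Swapping "dad" for "papa" barely changes what the model sees, which is why the substitution is invisible to it; nothing about that requires understanding.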
Though half the time it's "[deleted]" followed by "thanks, that really solved my problem, you're amazing!"
I still have mixed feelings about deleting one's whole comment history, though I also did that when I left Reddit. It's the right thing to do, but the amount of information lost because of greedy leadership is super sad.
Selling fewer explosive vehicles seems like a good idea tbh.