
Commit 9886313

server : free llama_batch on exit (#7212)
* [server] Clean up a memory leak on exit

  There are a couple of memory leaks on exit of the server, and this one hides the others. After cleaning it up, leaks on the slots become visible; that is another patch, to be sent after this one.

* make tab into spaces
1 parent f99e1e4

File tree

1 file changed: 2 additions, 0 deletions

examples/server/server.cpp

Lines changed: 2 additions & 0 deletions
@@ -673,6 +673,8 @@ struct server_context {
             llama_free_model(model);
             model = nullptr;
         }
+
+        llama_batch_free(batch);
     }
 
     bool load_model(const gpt_params & params_) {
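
Below is a minimal sketch of the resulting cleanup path, for context. It assumes the frees happen in the destructor of server_context and that a ctx member exists alongside model and batch; those details are inferred from the diff context and the public llama.h API, not copied from the rest of the upstream file.

#include "llama.h"

// Illustrative sketch only: member names and the destructor shape are
// assumed from the diff context above, not taken verbatim from upstream.
struct server_context {
    llama_model   * model = nullptr;
    llama_context * ctx   = nullptr;
    llama_batch     batch = {};

    ~server_context() {
        if (ctx) {
            llama_free(ctx);
            ctx = nullptr;
        }

        if (model) {
            llama_free_model(model);
            model = nullptr;
        }

        // The line this commit adds: llama_batch_free() releases the arrays
        // that llama_batch_init() allocates; skipping it leaks them on exit.
        llama_batch_free(batch);
    }
};

Since llama_batch_init() heap-allocates the token, position, and sequence-id arrays inside the batch, every batch created that way has to be paired with llama_batch_free(); on a value-initialized batch (all null pointers, as above) the call is a harmless no-op.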
