OpenAIEmbeddings does not allow setting encoding_format, causing incompatibility with LM Studio (returns float[], not base64) #8221
Labels
auto:bug
Related to a bug, vulnerability, unexpected error with an existing feature
Checked other resources
Example Code
WORKS with LM Studio if encoding_format is set manually (value doesn't matter)
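A minimal sketch of this case, calling LM Studio directly through the openai SDK (the model name, API key, and port are placeholders for my local setup):

```js
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:1234/v1", // LM Studio's OpenAI-compatible endpoint
  apiKey: "lm-studio",                 // LM Studio accepts any key
});

const res = await client.embeddings.create({
  model: "your-embedding-model",       // placeholder for whatever model LM Studio serves
  input: "Hello world",
  encoding_format: "float",            // "base64" works just as well; LM Studio ignores it
});

console.log(res.data[0].embedding.length); // 768 in my setup
```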
DOES NOT WORK with OpenAIEmbeddings – no way to set encoding_format
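The equivalent setup through OpenAIEmbeddings (again a sketch with placeholder model, key, and port) has no way to pass encoding_format, and the vectors come back with the wrong dimension, which is what triggers the Qdrant error below:

```js
import { OpenAIEmbeddings } from "@langchain/openai";

const embeddings = new OpenAIEmbeddings({
  model: "your-embedding-model",                          // placeholder
  apiKey: "lm-studio",                                    // LM Studio accepts any key
  configuration: { baseURL: "http://localhost:1234/v1" }, // LM Studio endpoint
  // No constructor option exists to set encoding_format here.
});

const vector = await embeddings.embedQuery("Hello world");
console.log(vector.length); // 192 instead of the expected 768
```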
Error Message and Stack Trace (if applicable)
```
[Running] node "c:\projects\rag-sample\index.mjs"
✅ Server running at http://localhost:11434
file:///c:/projects/rag-sample/node_modules/@qdrant/openapi-typescript-fetch/dist/esm/fetcher.js:169
        throw new fun.Error(err);
        ^

ApiError: Bad Request
    at Object.fun [as searchPoints] (file:///c:/projects/rag-sample/node_modules/@qdrant/openapi-typescript-fetch/dist/esm/fetcher.js:169:23)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async QdrantClient.search (file:///c:/projects/rag-sample/node_modules/@qdrant/js-client-rest/dist/esm/qdrant-client.js:165:26)
    at async QdrantVectorStore.similaritySearchVectorWithScore (file:///c:/projects/rag-sample/node_modules/@langchain/qdrant/dist/vectorstores.js:165:25)
    at async QdrantVectorStore.similaritySearch (file:///c:/projects/rag-sample/node_modules/@langchain/core/dist/vectorstores.js:260:25)
    at async file:///c:/projects/rag-sample/index.mjs:55:29 {
  headers: Headers {},
  url: 'http://localhost:6333/collections/eu-ai/points/search',
  status: 400,
  statusText: 'Bad Request',
  data: {
    status: {
      error: 'Wrong input: Vector dimension error: expected dim: 768, got 192'
    },
    time: 0.001095872
  }
}

Node.js v22.14.0

[Done] exited with code=1 in 6.972 seconds
```
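For context, the failing call at index.mjs:55 is an ordinary similarity search against an existing Qdrant collection, roughly like this (a sketch assuming QdrantVectorStore.fromExistingCollection from @langchain/qdrant; the URL and collection name are taken from the error output above, the model, key, and port are placeholders):

```js
import { QdrantVectorStore } from "@langchain/qdrant";
import { OpenAIEmbeddings } from "@langchain/openai";

const embeddings = new OpenAIEmbeddings({
  model: "your-embedding-model",                          // placeholder
  apiKey: "lm-studio",
  configuration: { baseURL: "http://localhost:1234/v1" }, // LM Studio endpoint
});

const store = await QdrantVectorStore.fromExistingCollection(embeddings, {
  url: "http://localhost:6333",
  collectionName: "eu-ai",
});

// embedQuery() produces a 192-dim vector instead of the expected 768,
// so Qdrant rejects the search with the 400 above.
const results = await store.similaritySearch("example query", 4);
```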
Description
I'm using OpenAIEmbeddings from LangChainJS in combination with LM Studio as the embedding backend. LM Studio provides a fully OpenAI-compatible API, but it always returns raw float[] embeddings, regardless of the encoding_format parameter.

Here's the problem: the only way to make this work is to explicitly set encoding_format in the SDK call, and interestingly, it works no matter which value you set ("float" or "base64"), because LM Studio ignores the parameter anyway and always returns float arrays.
However, since OpenAIEmbeddings does not allow setting encoding_format, I had to extend the class with a custom wrapper just to inject this parameter.
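A minimal sketch of the wrapper I ended up with (it assumes the protected embeddingWithRetry hook that OpenAIEmbeddings exposes in @langchain/openai 0.5.x; the hook name may differ between versions, and the model, key, and port are placeholders):

```js
import { OpenAIEmbeddings } from "@langchain/openai";

// Subclass that forces encoding_format onto every embeddings request.
class LMStudioEmbeddings extends OpenAIEmbeddings {
  embeddingWithRetry(request) {
    // The value doesn't matter (LM Studio ignores it and always returns
    // float[]), but the pipeline only works when the parameter is set.
    return super.embeddingWithRetry({ ...request, encoding_format: "float" });
  }
}

const embeddings = new LMStudioEmbeddings({
  model: "your-embedding-model",                          // placeholder
  apiKey: "lm-studio",                                    // LM Studio accepts any key
  configuration: { baseURL: "http://localhost:1234/v1" }, // adjust to your LM Studio port
});
```

With this wrapper in place, the pipeline works against LM Studio.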
System Info
@langchain/community: 0.3.43
@langchain/core: 0.3.56
@langchain/openai: 0.5.10
@langchain/qdrant: 0.1.2
Node: v22.14.0
Platform: Windows 11
Embedding backend: LM-Studio (localhost)