Reducing memory consumption in EigenDB.
Let's say you're using OpenAI's text-embedding-3-small embedding model to convert text into 1536-dimensional vectors.
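For reference, here is a minimal sketch of generating such an embedding with the official openai Python package (v1.x); the client setup assumes an OPENAI_API_KEY environment variable, and the input string is just an illustration:

```python
# Minimal sketch: generate a 1536-dimensional embedding with OpenAI's Python SDK (v1.x).
# Assumes the openai package is installed and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.embeddings.create(
    model="text-embedding-3-small",
    input="EigenDB is a vector database.",
)

vector = response.data[0].embedding  # a list of Python floats
print(len(vector))  # 1536
```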
In memory, a single 1536-dimensional vector is represented as a list of 1536 float32 values. Since each float32 occupies 4 bytes, a single vector consumes 1536 × 4 bytes = 6,144 bytes (≈ 6 KB).
Now let's assume you have 1,000,000 vectors in your database. The total memory consumption would be: 1,000,000 × 6,144 bytes = 6,144,000,000 bytes ≈ 6.1 GB.
And if you wanted to store 1,000,000,000 vectors, the memory consumption would come out to: 1,000,000,000 × 6,144 bytes = 6,144,000,000,000 bytes ≈ 6.1 TB.
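A quick way to sanity-check these numbers is to measure an actual float32 array. Here's a sketch using NumPy (an assumption for illustration; EigenDB's internal representation may differ):

```python
# Back-of-the-envelope check of the numbers above, assuming dense float32 storage.
import numpy as np

DIM = 1536
vector = np.zeros(DIM, dtype=np.float32)
print(vector.nbytes)  # 6144 bytes per vector (~6 KB)

for count in (1_000_000, 1_000_000_000):
    total_bytes = count * vector.nbytes
    print(f"{count:>13,} vectors -> {total_bytes / 1e9:,.1f} GB")
# 1,000,000 vectors     -> 6.1 GB
# 1,000,000,000 vectors -> 6,144.0 GB (~6.1 TB)
```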
So it is clear that memory consumption can quickly go through the roof when dealing with high-dimensional vectors at large scales.