
txtai load testing #305

@a-bawane

Description


I ran concurrency load testing of search requests in JMeter. txtai was not able to handle 10 concurrent users for 2 minutes; many requests failed with the error attached below. Has anyone done load testing on txtai before? What were your results?
I want to scale txtai to handle at least 50 concurrent users. Is there a way to achieve this?
I tried both the standalone and cluster configurations for txtai; both gave similar load testing results.
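For reference, here is roughly what my JMeter test plan does, written as a standalone Python sketch. The base URL, query text and the GET /search parameters are assumptions based on my setup; adjust them for your own deployment.

# Rough Python equivalent of the JMeter plan: 10 concurrent users sending
# search requests for 2 minutes. Assumes a txtai API instance is running
# on localhost:8000 and exposes GET /search; these values are placeholders.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

BASE_URL = "http://localhost:8000"  # placeholder txtai API address
USERS = 10                          # concurrent users
DURATION = 120                      # seconds

def user(worker_id):
    ok = errors = 0
    deadline = time.time() + DURATION
    while time.time() < deadline:
        try:
            response = requests.get(f"{BASE_URL}/search",
                                    params={"query": "test query", "limit": 3},
                                    timeout=10)
            response.raise_for_status()
            ok += 1
        except Exception:
            errors += 1
    return worker_id, ok, errors

with ThreadPoolExecutor(max_workers=USERS) as pool:
    for worker_id, ok, errors in pool.map(user, range(USERS)):
        print(f"user {worker_id}: {ok} ok, {errors} errors")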

My system configuration:
Processor: 11th Gen Intel(R) Core(TM) i7-11800H @ 2.30 GHz
Installed RAM: 16.0 GB (15.7 GB usable)
OS: Windows 11
System type: 64-bit operating system, x64-based processor

Error:
2022-07-13 16:40:19,080 [ERROR] run_asgi: Exception in ASGI application
Traceback (most recent call last):
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 366, in run_asgi
result = await app(self.scope, self.receive, self.send)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 75, in call
return await self.app(scope, receive, send)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\applications.py", line 261, in call
await super().call(scope, receive, send)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\applications.py", line 112, in call
await self.middleware_stack(scope, receive, send)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\middleware\errors.py", line 181, in call
raise exc
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\middleware\errors.py", line 159, in call
await self.app(scope, receive, _send)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\exceptions.py", line 82, in call
raise exc
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\exceptions.py", line 71, in call
await self.app(scope, receive, sender)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 21, in call
raise e
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 18, in call
await self.app(scope, receive, send)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\routing.py", line 656, in call
await route.handle(scope, receive, send)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\routing.py", line 259, in handle
await self.app(scope, receive, send)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\routing.py", line 61, in app
response = await func(request)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\routing.py", line 227, in app
raw_response = await run_endpoint_function(
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\fastapi\routing.py", line 162, in run_endpoint_function
return await run_in_threadpool(dependant.call, **values)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\starlette\concurrency.py", line 39, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\anyio\to_thread.py", line 28, in run_sync
return await get_asynclib().run_sync_in_worker_thread(func, *args, cancellable=cancellable,
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\anyio_backends_asyncio.py", line 818, in run_sync_in_worker_thread
return await future
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\anyio_backends_asyncio.py", line 754, in run
result = context.run(func, *args)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\api\routers\embeddings.py", line 30, in search
return application.get().search(query, request)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\api\base.py", line 34, in search
return super().search(query, limit)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\app\base.py", line 263, in search
return [{"id": r[0], "score": float(r[1])} if isinstance(r, tuple) else r for r in self.embeddings.search(query, limit)]
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\embeddings\base.py", line 285, in search
results = self.batchsearch([query], limit if limit else 3)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\embeddings\base.py", line 302, in batchsearch
return Search(self)(queries, limit if limit else 3)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\embeddings\search.py", line 47, in call
return self.dbsearch(queries, limit)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\embeddings\search.py", line 123, in dbsearch
result = self.database.search(query, [s for i, s in enumerate(search) if i in indices], limit)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\database\base.py", line 156, in search
where = self.embed(similarity, 0)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\database\sqlite.py", line 264, in embed
self.batch(indexids=[i for i, _ in similarity[batch]], batch=batch)
File "C:\Users\aksha\AppData\Local\Programs\Python\Python39\lib\site-packages\txtai\database\sqlite.py", line 476, in batch
self.cursor.execute(SQLite.CREATE_BATCH)
sqlite3.ProgrammingError: Recursive use of cursors not allowed.
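For context, this ProgrammingError is what Python's sqlite3 module raises when a single cursor object is used from more than one thread at the same time. Below is a minimal standalone sketch that typically reproduces the same message; the table and query are made up, and I do not know exactly how txtai shares its cursor internally.

# Minimal standalone reproduction of "Recursive use of cursors not allowed":
# one sqlite3 cursor shared by several threads running queries concurrently.
import sqlite3
import threading

connection = sqlite3.connect(":memory:", check_same_thread=False)
cursor = connection.cursor()
cursor.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, text TEXT)")
cursor.executemany("INSERT INTO docs (text) VALUES (?)",
                   [(f"document {i}",) for i in range(100000)])
connection.commit()

def worker():
    # Every "user" runs its queries through the same shared cursor
    try:
        for _ in range(100):
            cursor.execute("SELECT count(*) FROM docs WHERE text LIKE '%5%'")
            cursor.fetchall()
    except sqlite3.ProgrammingError as error:
        # Typically prints: Recursive use of cursors not allowed.
        print(error)

threads = [threading.Thread(target=worker) for _ in range(10)]
for thread in threads:
    thread.start()
for thread in threads:
    thread.join()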
