In the SeaTable script runner I would like to use SQL queries to update 60k rows faster. Currently I use base.list_rows and base.update_row, but it is very slow. I tried to import dtable_db, but it seems it is not available. Is it not included automatically in the latest installation? As I understand it, updates should then also work in Big Data. So please, any suggestions: 1. for managing/updating such a huge number of rows, and 2. for updating Big Data as well.
The base.query() method is much faster than list_rows(); please use it to query data. For updating data, you can use update_row() or query().
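Since base.query() accepts SQL, one way to speed up bulk updates is to batch them into a handful of UPDATE statements instead of 60k update_row() calls. The sketch below is a minimal, hedged illustration of that batching idea; the table name (`Table1`), column names (`status`, `_id`), and the helper `build_update_batches` are assumptions for the example, not anything from this thread, and the actual SQL features supported by your server version may differ.

```python
# Sketch: chunk a large set of row ids into SQL UPDATE statements
# that could each be sent through base.query(). Table/column names
# ("Table1", "status", "_id") are illustrative assumptions.

def build_update_batches(row_ids, new_value, batch_size=1000):
    """Split row ids into chunks and build one UPDATE statement per chunk."""
    batches = []
    for i in range(0, len(row_ids), batch_size):
        chunk = row_ids[i:i + batch_size]
        id_list = ", ".join(f"'{rid}'" for rid in chunk)
        batches.append(
            f"UPDATE Table1 SET status = '{new_value}' "
            f"WHERE _id IN ({id_list})"
        )
    return batches

# In a script-runner context you would then execute each statement, e.g.:
# for sql in build_update_batches(ids, "done"):
#     base.query(sql)
```

This reduces 60k API round trips to roughly 60 (at batch_size=1000), which is usually where the time goes.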
Did you update the Python pipeline? If not, please do. It is possible that you are running a very old version with an outdated Python client.
We are running the latest self-hosted SeaTable server and have just updated both the backend and the Python pipeline.
Our seatable-api version is 3.1.0 (Python client), and the API Gateway is running.
However, we are still unable to use the base.query() method as described in the documentation and by your support team.
For example, when running the following code in our script environment:
from seatable_api import Base
api_token = 'MY_TOKEN'
server_url = 'MY_URL'