Problems writing data #612
Here's my client:
5.2.0 was released a day after you reported this; try that one. Also try 5.0.0 and see if it is equally slow.
We also see massive performance problems when writing large-ish DataFrames with version 5.1.0. One example (see below) takes about 5 seconds on my laptop to write a million rows (one column) with version 5.0.0, but with version 5.1.0 it takes minutes (in fact I gave up before it finished). On the current master, performance is back to the 5.0.0 level. The current latest version on PyPI, 5.2.0, is broken (see #616), so I'll use 5.0.0 or master for now; I guess the fix in #617 will reach the pip world soon, though. I have no idea what the cause is; maybe something is wrong with our InfluxDB setup itself.

The example below produces the following output on master, and similar output on 5.0.0:

With version 5.1.0, however, this is how it looks:

Example code used (tried against both a 1.5.x and a 1.6.0 InfluxDB instance):
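The original script and its output are not reproduced above; as a rough stand-in, here is a minimal sketch of the kind of million-point benchmark described, assuming influxdb-python's DataFrameClient. The host, port, database name, and batch size are placeholder choices, not values from the original report.

```python
# Minimal benchmark sketch (placeholder host/database, not the original
# script): build a one-column, million-row DataFrame and time the write.
import time

import numpy as np
import pandas as pd
from influxdb import DataFrameClient

client = DataFrameClient(host='localhost', port=8086, database='example')
client.create_database('example')

# One column, one million rows, indexed by timestamp.
index = pd.date_range(start='2018-01-01', periods=1000000, freq='S')
df = pd.DataFrame({'value': np.random.randn(len(index))}, index=index)

start = time.time()
client.write_points(df, 'benchmark', protocol='line', batch_size=10000)
print('wrote %d points in %.1f s' % (len(df), time.time() - start))
```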
I am a bit disappointed and surprised by the speed I observe when writing to my InfluxDB instance. However, I am probably doing something very wrong and would appreciate any pointers in the right direction.
I am using the latest Docker image.
I have a DataFrame with 10 columns and 10,000 rows, i.e. 100,000 data points. Writing them into the database took about 30 seconds! I am running Ubuntu on a machine with 16 GB of RAM and an SSD drive.
I write column by column (all using the same measurement, with the column name as a tag), e.g. the pattern sketched below.
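A hypothetical illustration of that column-by-column pattern (not the reporter's actual code), assuming influxdb-python's DataFrameClient and a placeholder measurement name "sensors":

```python
# Column-by-column write sketch: same measurement for every column,
# column name stored as a tag. Measurement and connection details are
# placeholders for illustration only.
import numpy as np
import pandas as pd
from influxdb import DataFrameClient

client = DataFrameClient(host='localhost', port=8086, database='example')

index = pd.date_range(start='2018-01-01', periods=10000, freq='S')
df = pd.DataFrame(np.random.randn(10000, 10),
                  index=index,
                  columns=['col%d' % i for i in range(10)])

# One write per column: each call is a separate HTTP request.
for name in df.columns:
    single = df[[name]].rename(columns={name: 'value'})
    client.write_points(single, 'sensors', tags={'column': name},
                        protocol='line')
```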
I have tried other methods (e.g. the SeriesHelper), but the speed has never really picked up.
Here's the decisive fragment from my own client (it inherits from your standard client):
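The actual fragment is not quoted above; as a rough sketch of what such a wrapper might look like, assuming it subclasses influxdb-python's InfluxDBClient (the method name and point layout here are invented for illustration):

```python
# Illustrative wrapper only, not the reporter's code: a thin subclass of
# the standard client that writes one DataFrame column per call.
from influxdb import InfluxDBClient


class MyClient(InfluxDBClient):
    """Writes a single DataFrame column as points tagged with its name."""

    def write_column(self, df, column, measurement):
        points = [
            {
                'measurement': measurement,
                'tags': {'column': column},
                'time': ts.isoformat() + 'Z',
                'fields': {'value': float(value)},
            }
            for ts, value in df[column].items()
        ]
        # One HTTP request per column; batching all columns into a single
        # write_points call is usually much faster.
        self.write_points(points)
```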