Problem
There was a case where a user imported data causing millions of rows to be inserted into a table that had webhooks enabled. This caused high CPU usage for a prolonged period of time.
Proposal
With an in-memory queue, it's possible to bound it to a certain size. Once this size is surpassed, we could do either of the following (sketched after this list):
Block the producers of HTTP requests until there's more capacity in the queue.
Log an ERROR or WARNING.
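A minimal sketch of that backpressure behavior, written in Go rather than pg_net's actual C internals: a buffered channel plays the role of the bounded in-memory queue, the producer blocks once the bound is reached, and a warning is logged when that happens. The queue size, request type, and log messages are illustrative assumptions only.

```go
package main

import (
	"log"
	"time"
)

// request is a placeholder for a queued HTTP call (url only, for brevity).
type request struct {
	url string
}

// boundedQueue wraps a buffered channel; the buffer size is the bound.
type boundedQueue struct {
	ch chan request
}

func newBoundedQueue(size int) *boundedQueue {
	return &boundedQueue{ch: make(chan request, size)}
}

// enqueue blocks the producer when the queue is full and logs a WARNING,
// mirroring the "block producers + log" behavior proposed above.
func (q *boundedQueue) enqueue(r request) {
	select {
	case q.ch <- r: // fast path: capacity available
	default:
		log.Printf("WARNING: request queue full, blocking producer for %s", r.url)
		q.ch <- r // blocks until the consumer frees a slot
	}
}

func main() {
	q := newBoundedQueue(2) // tiny bound just to demonstrate backpressure

	// Consumer: drains the queue slowly, simulating the worker that sends requests.
	go func() {
		for r := range q.ch {
			time.Sleep(100 * time.Millisecond)
			log.Printf("sent %s", r.url)
		}
	}()

	// Producer: tries to enqueue faster than the consumer drains.
	for i := 0; i < 10; i++ {
		q.enqueue(request{url: "https://example.com/webhook"})
	}
	time.Sleep(2 * time.Second)
}
```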
Note
Spilling over to disk (i.e., inserting into another table) is not an option, as it would make the usage more complex.
@steve-chavez hello
My net table once had 100,000 rows piled up, which forced me to divert some network requests and send them through my own service instead.
http1234 --> pg_net
http5678 --> my_pg_net, and I have a service that reads my_pg_net and sends the HTTP requests
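A rough sketch of that workaround, assuming a shadow table named my_pg_net with url and body text columns (the table layout, connection string, batch size, and poll interval are all assumptions, not anything pg_net provides): a small Go service drains the table in batches and issues the HTTP requests itself.

```go
package main

import (
	"bytes"
	"database/sql"
	"log"
	"net/http"
	"time"

	_ "github.com/lib/pq"
)

func main() {
	// Connection string and table layout are assumptions for this sketch.
	db, err := sql.Open("postgres", "postgres://localhost/mydb?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	for {
		// Claim and remove a batch of pending requests in one statement.
		rows, err := db.Query(`DELETE FROM my_pg_net
		                       WHERE id IN (SELECT id FROM my_pg_net ORDER BY id LIMIT 100)
		                       RETURNING url, body`)
		if err != nil {
			log.Fatal(err)
		}
		for rows.Next() {
			var url, body string
			if err := rows.Scan(&url, &body); err != nil {
				log.Fatal(err)
			}
			// Send the request ourselves instead of letting pg_net do it.
			resp, err := http.Post(url, "application/json", bytes.NewBufferString(body))
			if err != nil {
				log.Printf("post %s failed: %v", url, err)
				continue
			}
			resp.Body.Close()
		}
		rows.Close()
		time.Sleep(time.Second) // poll interval
	}
}
```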