Currently, the CRAN incoming FTP server is polled once an hour:

cransays/.github/workflows/render-dashboard.yml
Lines 10 to 11 in b0cc818

Have you considered increasing this to, say, two or four times an hour? I doubt it would make a big dent in the total amount of traffic that the CRAN server sees. It might even help decrease the traffic by moving someone who is tracking their package manually to looking at CRANsays instead; once an hour is not frequent enough for such use.

UPDATE: I see that https://nx10.github.io/cransubs/ is updated once every ten minutes.

UPDATE 2: It's updated only when someone accesses it, and I guess at most every 10 minutes.

Thanks for the report! I agree this would be desirable.

If we increase the frequency, it needs to be accompanied by a series of performance improvements to avoid any unnecessary resource consumption.

My main unresolved question at the moment is how historical-data saving fits into this. Should we still save all occurrences (related to #52)? Or keep the history hourly with, e.g., a counter so that historical data is only saved every second or fourth run? 🤔
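Putting the two ideas in this thread together — polling more often while persisting history only every few runs — a minimal sketch might look like the following. This is an assumption-laden illustration, not the real render-dashboard.yml: the job and step names are invented, and the actual dashboard-rendering steps are elided.

```yaml
# Sketch only: job/step names are assumptions, not the real workflow.
# Polls four times an hour, but flags a historical snapshot for saving
# only on every fourth run, so history stays roughly hourly.
on:
  schedule:
    - cron: "*/15 * * * *"  # four times an hour (previously hourly)

jobs:
  render:
    runs-on: ubuntu-latest
    steps:
      - name: Decide whether to keep a historical snapshot
        run: |
          # GITHUB_RUN_NUMBER increments once per workflow run, so
          # "run number modulo 4" fires on every fourth run.
          if [ $((GITHUB_RUN_NUMBER % 4)) -eq 0 ]; then
            echo "SAVE_HISTORY=true" >> "$GITHUB_ENV"
          else
            echo "SAVE_HISTORY=false" >> "$GITHUB_ENV"
          fi
      - name: Save historical data
        if: env.SAVE_HISTORY == 'true'
        run: echo "persist snapshot here (hypothetical step)"
```

Note that GitHub Actions expressions have no modulo operator, so the counter logic lives in a shell step that exports the decision through `$GITHUB_ENV` for later steps to test.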