Consuming rate-limited APIs at scale is like interacting with a database that has pretty painful latency. I keep "inventing" what are surely preexisting design patterns / solutions to this problem. Can anyone point me to a good guide so that I can stop wasting my time?
-
-
I would think you'd want to mock the APIs you consume with local services and cache the requests you make, where the cache could even be a local durable DB. You can always add an option to bypass your local cache as well.
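A minimal sketch of that idea: a read-through cache in front of a rate-limited API, backed by a local durable SQLite database, with an option to bypass the cache. All names here (`DurableCache`, `fetch_fn`, `ignore_cache`) are illustrative assumptions, not from any particular library.

```python
import json
import sqlite3
import time

class DurableCache:
    """Read-through cache backed by SQLite; fetch_fn stands in for the real API call."""

    def __init__(self, path=":memory:", ttl_seconds=300):
        self.conn = sqlite3.connect(path)
        self.ttl = ttl_seconds
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS cache "
            "(key TEXT PRIMARY KEY, value TEXT, fetched_at REAL)"
        )

    def get(self, key, fetch_fn, ignore_cache=False):
        if not ignore_cache:
            row = self.conn.execute(
                "SELECT value, fetched_at FROM cache WHERE key = ?", (key,)
            ).fetchone()
            if row and time.time() - row[1] < self.ttl:
                return json.loads(row[0])  # fresh cache hit: no API call spent
        value = fetch_fn(key)  # miss, stale, or explicit bypass: hit the real API
        self.conn.execute(
            "INSERT OR REPLACE INTO cache VALUES (?, ?, ?)",
            (key, json.dumps(value), time.time()),
        )
        self.conn.commit()
        return value
```

Pointing `path` at a real file instead of `:memory:` makes the cache survive restarts, which is the "local durable DB" part.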
-
You can periodically warm your caches so you always have data available even though it’s stale.
