Hi all, I just read (but have not yet applied) the tutorial pages and I have a question. Let's say I have a legacy relational database/application with a REST API. What would be the easiest way to set up an LDES stream for a specific table/object? As I read the docs right now, the easiest way to integrate (and be able to publish continually) would be to expose an endpoint on the legacy application that the workbench polls for changed objects via the HttpInPoller.
I see an issue with the fact that the endpoint would have to be stateful, given how the HttpInPoller works. Ideally it would be stateless, taking a "give me all objects since" parameter (see the sketch below). Another option would be to publish to a Kafka topic, but that introduces yet another component into the architecture. I think it would be easier to implement it in the way described above. Any suggestions?
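To make the idea concrete, here is a minimal sketch of the stateless "give me all objects since" endpoint I have in mind. It uses Flask, an in-memory list standing in for the legacy table, and hypothetical field names; a real implementation would of course query the database on a "last modified" column.

```python
# Sketch of a stateless "give me all objects since" endpoint.
# Assumptions (not part of the actual legacy API): Flask, an in-memory
# stand-in for the legacy table, and an ISO-8601 "since" query parameter
# with an explicit timezone offset.
from datetime import datetime

from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for the legacy table; a real implementation would query the
# relational database on its "last modified" column instead.
OBJECTS = [
    {"id": 1, "name": "object-1", "modified": "2024-01-01T10:00:00+00:00"},
    {"id": 2, "name": "object-2", "modified": "2024-02-15T08:30:00+00:00"},
]


@app.get("/objects")
def objects_since():
    """Return every object modified after the 'since' timestamp.

    The caller keeps track of when it last fetched, so the endpoint
    itself stays stateless and a poller can simply repeat the request.
    """
    since = datetime.fromisoformat(
        request.args.get("since", "1970-01-01T00:00:00+00:00")
    )
    changed = [
        o for o in OBJECTS
        if datetime.fromisoformat(o["modified"]) > since
    ]
    return jsonify(changed)


if __name__ == "__main__":
    app.run(port=8080)
```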
Replies: 1 comment
Hi @DenEwout,
From your post I conclude that you don't really want to migrate data from a legacy system to LDES, but rather publish data from the legacy system as LDES. Is this correct?
Based on your information, the approach with the least work (or no work at all) on the legacy side is to scrape the existing API, as done in this example: Scraping an API.
In the mapping you can convert only the information you need to linked data and drop whatever isn't necessary.
If you do not want to poll frequently, changes would be needed on the legacy side to push updates to the workbench, as you have already indicated; a rough sketch of that push variant follows below.
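To make the push variant concrete, here is a rough sketch, in Python, of how the legacy side could push a changed record to the workbench. The URL and the plain-JSON payload are assumptions for illustration only; the actual input component, endpoint and expected content type depend on how you configure your pipeline.

```python
# Sketch of the push alternative: the legacy side sends changed records
# to an HTTP input of the workbench instead of being polled.
# Assumptions (not from this thread): the workbench pipeline listens on
# the URL below and its mapping accepts plain JSON; adjust both to match
# your actual pipeline configuration.
import requests

WORKBENCH_INGEST_URL = "http://localhost:8081/my-pipeline"  # hypothetical


def push_changed_record(record: dict) -> None:
    """POST a single changed record to the workbench pipeline."""
    response = requests.post(WORKBENCH_INGEST_URL, json=record, timeout=10)
    response.raise_for_status()


if __name__ == "__main__":
    # Example: push one changed row from the legacy table.
    push_changed_record(
        {"id": 1, "name": "object-1", "modified": "2024-03-01T09:00:00+00:00"}
    )
```

Whether you push per record or in small batches, and in which format, would again be handled by the mapping you configure in the pipeline.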