I want to develop a new website with dynamic content. This content has to be pulled from several other websites that require a login. Unfortunately those websites don't provide APIs to retrieve the data automatically. The content also seems to be loaded into the HTML elements with JavaScript, so it isn't present in the plain page source. I've done some research, and web scrapers are one of the suggested solutions. I'm curious how the following can best be achieved:
- Retrieve content regularly from the protected websites; the trigger could be an e-mail received in my inbox.
- Is there a better solution than web scraping?
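To make the scraping part concrete, here is roughly what I had in mind, sketched in Python. From what I've read, content that is "loaded with JavaScript" is often either fetched from a JSON endpoint you can spot in the browser's network tab, or embedded as JSON in a script tag. The marker name `window.__DATA__` below is purely an assumption for illustration; every site uses its own variable names:

```python
import json
import re

# Many "JavaScript-loaded" pages actually embed the data as a JSON object
# in a <script> tag, or fetch it from a JSON endpoint visible in the
# browser's network tab. This helper pulls embedded JSON out of page HTML.
def extract_embedded_json(html: str, marker: str = "window.__DATA__") -> dict:
    """Find `marker = {...};` in the page source and parse the JSON object."""
    pattern = re.escape(marker) + r"\s*=\s*(\{.*?\});"
    match = re.search(pattern, html, re.DOTALL)
    if match is None:
        raise ValueError(f"no JSON found after marker {marker!r}")
    return json.loads(match.group(1))

# Demo with a fabricated page fragment (the marker and structure are
# made up; a real scraper would fetch the page with a logged-in session):
sample_html = """
<html><body>
<script>window.__DATA__ = {"articles": [{"title": "Hello"}]};</script>
</body></html>
"""
data = extract_embedded_json(sample_html)
print(data["articles"][0]["title"])  # -> Hello
```

If the data is not embedded like this but rendered client-side, I understand a headless browser (Selenium, Playwright) would be needed instead, plus a session login, but the idea of "find where the JSON actually comes from" seems to be the first thing to check.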
Secondly, since I'm not familiar with web development, I'm interested in which techniques are best suited to build this website:
- Does WordPress suffice?
- Does WordPress have the functionality to retrieve (hidden) data from other protected websites?
- Is it possible to use e-mail as a trigger to retrieve this data?
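For the e-mail trigger, what I imagine is a small script that polls the inbox over IMAP and starts a scrape run when a matching mail arrives. This is only a sketch with Python's standard `imaplib`; the host, credentials, and trigger subject are placeholders I made up:

```python
import email
import imaplib

TRIGGER_SUBJECT = "new content available"  # hypothetical subject line

def is_trigger(subject: str) -> bool:
    """Decide whether a mail subject should trigger a scrape run."""
    return TRIGGER_SUBJECT in subject.lower()

def poll_inbox(host: str, user: str, password: str) -> bool:
    """Check the inbox for an unread trigger mail; return True if one is found.
    Host and credentials are placeholders -- fill in your own provider."""
    with imaplib.IMAP4_SSL(host) as imap:
        imap.login(user, password)
        imap.select("INBOX")
        _, data = imap.search(None, "UNSEEN")
        for num in data[0].split():
            _, msg_data = imap.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            if is_trigger(msg.get("Subject", "")):
                return True
    return False

# The matcher can be checked without any mail server:
print(is_trigger("FW: New content available on site X"))  # -> True
```

The polling itself would presumably run on a schedule (cron or similar) rather than continuously.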
I think the requirements above need custom development. If that's the case, which programming language is required besides the obvious standards (HTML/CSS)? I know from a long time ago that PHP was considered best practice; is this still the case?
The goal is to build a website where content management is mostly automated. I'm just looking for some advice on which direction to head in.
Thanks in advance!
Kind regards,
Sef
