I have two identical web pages which can be accessed via different URLs, and I need to check that the contents of both web pages are the same
for example if I run the tool against http://www.google.co.in/ and http://www.google.co.uk/
the only difference is in the image
I know of just one tool for now, HTML Match, which can do such comparisons.
Are there any other tools, preferably open source, which can perform a similar task?
[ QUOTE ]
Any tool where I can enter the web links and then the tool can give me the differences?
[/ QUOTE ]
I don't know of a tool with that as its sole purpose.
I have written many scripts using automated test tools (WinRunner in the past, WinTask now) to do exactly that.
Some of our current automated regression tests use that model as part of the process.
- A list of URLs is used as input
- each page is scraped
- the current page is compared to a baseline of the same page
- any differences are noted and dumped into a folder for later analysis
We also use a variant of that model, where we compare the page in the system under test to the equivalent page in production (rather than a saved baseline).
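The scrape-and-compare loop above can be sketched in a few lines of Python using the standard library's difflib. This is a minimal illustration, not the poster's actual WinTask scripts: the page sources here are hard-coded stand-ins (in practice you would fetch each URL with urllib.request or similar, and the baseline would come from disk).

```python
import difflib

def diff_pages(name_a, html_a, name_b, html_b):
    """Return unified-diff lines between two page sources."""
    return list(difflib.unified_diff(
        html_a.splitlines(), html_b.splitlines(),
        fromfile=name_a, tofile=name_b, lineterm=""))

# Hypothetical stand-ins for two scraped pages that differ only in an image.
page_in = "<html><body><img src='logo_in.png'><p>Search</p></body></html>"
page_uk = "<html><body><img src='logo_uk.png'><p>Search</p></body></html>"

for line in diff_pages("google.co.in", page_in, "google.co.uk", page_uk):
    print(line)
```

Looping this over a list of URLs and writing any non-empty diff output to a folder gives you the "dump differences for later analysis" step described above.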