Link Checker
by Stephen Johns - 2 days ago (2017-08-28)
+2  I need a way to find broken links in a Web site.
+1  by Fernando - 23 hours ago (2017-08-30)
I do not think there is a package that handles that. It is basically a matter of sending a request to each link and analyzing the response. Use cURL to accomplish that.
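A minimal sketch of what Fernando describes, assuming the cURL extension is available; the function name checkLink and the sample URL are only placeholders, not part of any package mentioned in this thread:

<?php
// Send a HEAD request (headers only) and read the HTTP status code.
function checkLink($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // headers only, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // do not echo the response
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code >= 200 && $code < 400;              // treat 2xx/3xx as not broken
}

var_dump(checkLink('http://example.com/')); // bool(true) if the link responds
?>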
+1  by Dave Smith - 2 days ago (2017-08-28)
It is a multi-part process. First you need to scrape the website and retrieve the links, which is fairly easy. Then you can use this class to send HTTP requests to the linked sites and capture the responses to check whether they come back with a good status.
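For the first step Dave mentions (retrieving the links), here is a rough sketch using PHP's built-in DOM extension rather than the recommended class, whose API is not reproduced here; extractLinks is a made-up name:

<?php
// Fetch a page and collect the href attributes of all <a> tags.
function extractLinks($url)
{
    $html = file_get_contents($url);      // assumes allow_url_fopen is enabled
    if ($html === false) {
        return array();
    }
    $doc = new DOMDocument();
    @$doc->loadHTML($html);               // suppress warnings from sloppy markup
    $links = array();
    foreach ($doc->getElementsByTagName('a') as $a) {
        $href = $a->getAttribute('href');
        if ($href !== '') {
            $links[] = $href;
        }
    }
    return array_unique($links);
}
?>

Each collected link can then be passed to an HTTP request (for example the checkLink() sketch above) to see whether it still resolves.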
1. by Till Wehowski - 22 hours ago (2017-08-30)
I agree with Dave Smith's recommendation of https://www.phpclasses.org/package/3-PHP-HTTP-client-to-access-Web-site-pages.html for testing the HTTP response code; you can fetch only the headers and check the status code (see the sketch after the regex below). For the first task, fetching the links, I would recommend:
or just a simple REGEX:
// capture the quote style, the href value, and the anchor text of every <a> tag
$regexp = "<a\s[^>]*href=(\"??)([^\" >]*?)\\1[^>]*>(.*)<\/a>";
preg_match_all("/$regexp/siU", $this->content, $matches);
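For reference, a small usage sketch combining the two suggestions above: run the repaired regex over a fetched page, then request only the headers of each absolute link with get_headers() and print the status line. The target URL and the local $content variable (standing in for $this->content) are placeholders:

<?php
$content = file_get_contents('http://example.com/');

// Group 2 of the regex captures the href value of each <a> tag.
$regexp = "<a\s[^>]*href=(\"??)([^\" >]*?)\\1[^>]*>(.*)<\/a>";
preg_match_all("/$regexp/siU", $content, $matches);

// Make get_headers() issue HEAD requests so only the headers are transferred.
stream_context_set_default(array('http' => array('method' => 'HEAD')));

foreach (array_unique($matches[2]) as $url) {
    if (strpos($url, 'http') !== 0) {
        continue;                          // skip relative links and anchors
    }
    $headers = @get_headers($url);         // e.g. $headers[0] == "HTTP/1.1 200 OK"
    echo $url . ' => ' . ($headers ? $headers[0] : 'no response') . "\n";
}
?>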
2. by Till Wehowski - 22 hours ago (2017-08-30) in reply to comment 1 by Till Wehowski
Somehow the regex in my answer was broken by the site; here it is as a gist: gist.github.com/wehowski/afc811cb4eb727e97e2a75b1b9d3e3c6