Discussion on LinksCatcher - Capture links from any web resource

Manrix

Manrix supports this item

Supported

This author's response time can be up to 2 business days.

9 comments found.

Bought your script, looks good so far. Only problem I’ve come across is that I tried to adjust the config file to let me go 10 layers deep, but the script still only allows 2 layers max. Please advise on a fix.

Yes, I understand. You have to change the max attribute of that input field to the number you want, directly in index.html at line 180. It isn’t connected to config.php: the value of that input overrides the value you set in the config file. If you want to prevent users from overriding it, remove that input entirely. If you can’t do it yourself, let me know and I’ll do it for you.
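As a rough sketch of what the author describes (the attribute names here are assumptions, only the max attribute and the line-180 location come from the reply above), the input in index.html might look something like this:

```html
<!-- index.html, around line 180: raise max to allow deeper crawls,
     or delete this input so the config.php value applies instead.
     name/value shown are hypothetical. -->
<input type="number" name="depth" min="1" max="10" value="2">
```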

Thanks, that fixed it!

Good. Let me know if you run into any other problems.

What is the point of capturing links?

Well, it was useful for me, so I thought it could be useful for others too. I also wanted to test myself: as you can see, this is the first item I’m selling.

I gave it a try. Why does it return status 200 but only 101 links? The website URL I added is big.

Thanks for the feedback. Are you sure the links were all different? The script filters out duplicates. Can you give me the URL? Thanks.

Nice work GLWS :)

Thank you. ;)

A pleasure to see on CodeCanyon… Very good work, Author. Although I would have no use for this script in my projects, I have added you to my favourite Authors. Your design is clean and very professional.

I look forward to seeing more of your work, hopefully in the near future.. :)

I appreciate it. Thank you. :)

Hi, I want to fetch only external links, and only from the homepage (not internal links on the same domain)… and then, for each one, repeat the same cycle and save them to my DB. I’m thinking it may run as a cron job… It’s like a bot for fetching links and building my database. Is the script’s code complex, or is some average PHP (on my side) enough to make the necessary changes? Thanks!

Hi. Yes, you can set up the script to do only that. It can be done easily, because the script outputs an array you can iterate over and insert into the database. The script isn’t very complex. Let me know if you need help.
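To illustrate the idea from the reply above, here is a minimal PHP sketch. The input array and the filtering helper are assumptions (the source only says the script "outputs an array"); the database part is left as a commented-out PDO example rather than a claim about the script’s own API:

```php
<?php
// Hypothetical sketch: assume LinksCatcher has produced a flat array
// of absolute URLs for a page on example.com.
$links = [
    'https://example.com/about',    // internal
    'https://partner.org/page',     // external
    'https://example.com/contact',  // internal
    'https://other.net/',           // external
];

$base = 'example.com';

// Keep only links whose host differs from the crawled domain,
// i.e. the "external links only" case the commenter asked about.
$external = array_values(array_filter($links, function ($url) use ($base) {
    $host = parse_url($url, PHP_URL_HOST);
    return $host !== null && $host !== $base;
}));

// Each external link could then be stored, e.g. with PDO:
// $stmt = $pdo->prepare('INSERT INTO links (url) VALUES (?)');
// foreach ($external as $url) {
//     $stmt->execute([$url]);
// }

print_r($external);
```

Run from a cron job, each stored URL could be fed back into the scraper on the next pass to get the repeating cycle described above.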

Hi, great.

“The script outputs an array” is perfect for me and my average PHP coding skills.

I’m setting up a domain; I should return soon to buy it.

Thanks for the support.

Hi there, it looks like the script is missing something. I checked one domain with another backlink checker tool, and it showed more links than yours. I checked it with multiple other tools; the results varied, but yours shows only 5 to 10% of what the other tools show. Why is that? Also, is it possible to get the full-length URLs for inbound and outbound links?

Are you sure they’re unique links? My script removes duplicates. Can you please give some examples? Thanks.

Hi, thanks for the reply. Check snap.com.au. I checked it in these checkers:

https://monitorbacklinks.com/seo-tools/free-backlink-checker (links: 244.2K)
www.semrush.com (links: 50.8K)
http://www.backlinkwatch.com/index.php (links: 9727)

I am interested in your script if it’s working properly.

My script fetches the links from the given URLs. You could use it to follow the links and scrape those sites too, but it would take a lot of time. It’s a general link scraper, not one specific to backlinks. I don’t know how those sites fetch links so fast; they probably use a Google API.

Congratulations! Good luck with your sales ;)

Is it possible to also save the title of the linked page? And is it possible to restrict the crawl to stay within the domain?
