ianthonypillos supports this item



I sent you an email two days ago about chapter scraping, but I didn't get any response from your side.

Use the mass scraper instead, but it could take time to finish all the manga.

The mass scraper only scrapes each manga's cover page and details, not the chapters.

It has a manga chapter scraper as well; please check the demo.

Is there a way to mass scrape nhentai series?

Yes, but it would be customization work.


zetakai Purchased

Error in the NineManga Spanish scraper.


must go out


please help

The error occurs in both storage modes:

- Storage Mode: Copy URLs without downloading images.
- Storage Mode: Local server.



please help

Hi, I think they added hotlink protection on their images. I'll check on that.


zetakai Purchased

Any solution, please?


zetakai Purchased

A function to download the images to the server could work around it.
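For anyone hitting the same issue, here is a rough sketch of fetching a hotlink-protected image server-side with PHP cURL, sending the source site as the Referer so the protection accepts the request. The function name and save path are illustrative assumptions, not part of the script's actual API:

```php
<?php
// Illustrative sketch only: download a remote image to the local server,
// sending a Referer header so hotlink protection accepts the request.
function downloadImage($url, $referer, $savePath)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow redirects
    curl_setopt($ch, CURLOPT_REFERER, $referer);      // pretend the request came from the source site
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0');
    $data = curl_exec($ch);
    $ok = ($data !== false && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200);
    curl_close($ch);

    if ($ok) {
        file_put_contents($savePath, $data);          // save the image locally
    }
    return $ok;
}
```

Whether this works depends on how the target site checks the Referer; some sites also check cookies or IP ranges.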


bjonh Purchased

Please update to Laravel 5.4!

I'm working on it. It won't be Laravel 5.4 but Laravel 5.2, since some users don't have PHP 7 yet.


bjonh Purchased

I'm getting a 500 error. The requirements check passes: PHP >= 5.4 <OK>, MCrypt PHP Extension <OK>, PHP JSON extension <OK>.

=========== nginx config file ===========

server {
    server_name www.truyentinh.org;
    rewrite ^(.*) http://truyentinh.org$1 permanent;
}

server {
    listen 80;
    access_log off;
    # access_log /home/truyentinh.org/logs/access_log;
    error_log off;
    # error_log /home/truyentinh.org/logs/error.log;
    add_header X-Frame-Options SAMEORIGIN;
    add_header X-Content-Type-Options nosniff;
    add_header X-XSS-Protection "1; mode=block";
    root /home/truyentinh.org/public_html;
    include /etc/nginx/conf/ddos2.conf;
    index index.php index.html index.htm;
    server_name truyentinh.org;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        include /etc/nginx/fastcgi_params;
        fastcgi_pass;
        fastcgi_index index.php;
        fastcgi_connect_timeout 250;
        fastcgi_send_timeout 250;
        fastcgi_read_timeout 250;
        fastcgi_buffer_size 256k;
        fastcgi_buffers 4 256k;
        fastcgi_busy_buffers_size 256k;
        fastcgi_temp_file_write_size 256k;
        fastcgi_intercept_errors on;
        fastcgi_param SCRIPT_FILENAME /home/truyentinh.org/public_html$fastcgi_script_name;
    }
}


Your site is working. Did you fix it already? It looks like a server configuration issue.

When will the update come out?

If possible, can you make the chapter pages visible in the URL as well, so they also get indexed by search providers? And please try to keep the URLs consistent, for example:

http://mangaoverload.com/manga/Naruto/29/1 instead of http://mangaoverload.com/Naruto/chapter/29

That is easier for users to navigate and keeps the linking consistent, since the manga page is already http://mangaoverload.com/manga/Plus_Tic_Neesan. This will improve SEO for sure.

I can't say the exact time for the update, but I'm still working on it. Yes, that's actually possible; you just need to edit the method that triggers that part.

How do I get the page number in the Route::get part?

Which method or URL are you talking about? This URL?
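For reference, one way to expose the page number as a route parameter in Laravel 5.x looks like this. The URL segments and the closure body are a sketch under the URL scheme suggested above; the script's actual controller and method names are not known here:

```php
// Hypothetical route sketch (Laravel 5.x): the page number arrives as
// the third URL segment, e.g. /manga/Naruto/29/1 -> $page = 1.
Route::get('manga/{slug}/{chapter}/{page?}', function ($slug, $chapter, $page = 1) {
    // $page defaults to 1 when the segment is omitted;
    // look up the chapter here and render the requested page image.
});
```

In a real controller you would type the same parameters on the action method instead of a closure.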


miukun Purchased

Hello, Author.

This is a nice product, but I have an issue.

Mass Scrape is not working. Please check it.


How to fix it?

Thanks .

Hi, please send me the login details for your site through my support tab so I can help you.


miukun Purchased

Hello, I already sent the message. When can you support me? I understand you are busy; please help me.

Hi, did you send me your server details, not just the admin login for your site, so I can check it?

Hello, I keep asking for support but I haven't gotten any proper support from your side. http://mangareader.eu/ (the mass scraper only scrapes each manga's title, nothing else) http://mangareader.eu/?page=8

Check both links; I don't know what kind of problem your script has.

When I scrape chapters, I'm automatically logged out of the dashboard.

Hi, you're using the mass scraper, right? Are you saving the images locally, or linking to the image URLs on the site you are scraping?

The logout while scraping chapters comes from the server session configuration. I might look into a possible solution for this in my MangaOverload v2.
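If the logout happens because the session expires during a long scrape, one thing worth trying, as a guess rather than a confirmed fix, is raising the session lifetime in Laravel's config/session.php:

```php
// config/session.php (Laravel) - excerpt. Raising the lifetime is only
// a workaround guess for sessions expiring during long-running scrapes.
'lifetime' => 240,          // minutes of inactivity before the session expires
'expire_on_close' => false, // keep the session when the browser closes
```

If the server runs PHP's default file-based session garbage collection, session.gc_maxlifetime in php.ini may also need to be at least as long.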


sipet929 Purchased

Hey are you receiving my emails? If so could you respond?

I replied already. Sorry for the delay.