dramaticcat@sh.itjust.works to Lemmy Shitpost@lemmy.world · 1 year ago
Chad scraper (image post · 1.04K upvotes · 93 comments)
bill_1992@lemmy.world · 1 year ago: Everyone loves the idea of scraping; no one likes maintaining scrapers that break once a week because the CSS or HTML changed.
deleted by creator
Anonymousllama@lemmy.world · 1 year ago: This one. One of the best motivators. Sense of satisfaction when you get it working and you feel unstoppable (until the next subtle change happens, anyway).
I feel this
camr_on@lemmy.world · 1 year ago: I loved scraping until my IP was blocked for botting lol. I know there are ways around it, it's just work though.
Pennomi@lemmy.world · 1 year ago: I successfully scraped millions of Amazon product listings simply by routing through Tor and cycling the exit node every 10 seconds.
camr_on@lemmy.world · 1 year ago: That's a good idea right there, I like that.
This guy scrapes
ferret@sh.itjust.works · 1 year ago: lmao, yeah, get all the exit nodes banned from Amazon.
Pennomi@lemmy.world · 1 year ago: That's the neat thing, it wouldn't, because traffic only spikes for 10s on any particular node. It perfectly blends into the background noise.
nilloc@discuss.tchncs.de · 1 year ago: Cue Office Space-style error and scrape for 10 hours on each node.
You guys use IPs?
Token ring for me baybeee
camr_on@lemmy.world · 1 year ago: I'm coding baby's first bot over here lol, I could probably do better.
dangblingus@lemmy.world · 1 year ago: Or in the case of Wikipedia, every table on successive pages for sequential data is formatted differently.
Matriks404@lemmy.world · 1 year ago: Just use AI to make changes ¯_(ツ)_/¯
Here, take these: \\
¯\_(ツ)_/¯ Thanks
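A minimal sketch of one common way to soften the "broke again because the CSS changed" problem bill_1992 describes: try a list of candidate selectors in order and fail loudly when none match, so breakage is obvious instead of silently yielding empty data. This assumes the third-party BeautifulSoup (`bs4`) library; the HTML snippet and selectors are made-up examples, not from any real site.

```python
from bs4 import BeautifulSoup  # assumed third-party dependency


def select_first(html: str, selectors: list[str]) -> str:
    """Return the text of the first candidate selector that matches, else raise."""
    soup = BeautifulSoup(html, "html.parser")
    for sel in selectors:
        node = soup.select_one(sel)
        if node is not None:
            return node.get_text(strip=True)
    # Raising (instead of returning None) makes layout changes fail fast.
    raise LookupError(f"no selector matched; page layout may have changed: {selectors}")


# Hypothetical usage: both the old and the new layout keep working
# until *every* candidate selector breaks at once.
html = '<div class="product"><span class="price-v2">$9.99</span></div>'
price = select_first(html, ["span.price", "span.price-v2"])
```

The design choice here is just to turn a silent weekly breakage into an immediate, named error; it does not stop pages from changing, it only makes the change visible.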
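Pennomi's trick (routing through Tor and cycling the exit node every 10 seconds) can be sketched roughly as below. This is a hypothetical illustration, not Pennomi's actual code: it assumes a local Tor daemon with its SOCKS proxy on port 9050 and control port on 9051, plus the third-party `stem` and `requests` libraries. The `should_rotate` helper and `ROTATE_SECONDS` name are invented for the example.

```python
import time

ROTATE_SECONDS = 10  # cycle the exit node roughly this often

# Route HTTP(S) through the local Tor SOCKS proxy (assumed on 9050).
# socks5h resolves DNS through Tor as well.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}


def should_rotate(last_rotate: float, now: float, interval: float = ROTATE_SECONDS) -> bool:
    """Return True once `interval` seconds have passed since the last rotation."""
    return now - last_rotate >= interval


def new_tor_circuit(control_port: int = 9051) -> None:
    """Ask Tor for a new circuit (NEWNYM), which usually changes the exit node."""
    from stem import Signal
    from stem.control import Controller

    with Controller.from_port(port=control_port) as controller:
        controller.authenticate()  # assumes cookie auth is configured
        controller.signal(Signal.NEWNYM)


def scrape(urls):
    """Fetch each URL through Tor, rotating the circuit every ROTATE_SECONDS."""
    import requests

    last_rotate = time.monotonic()
    for url in urls:
        now = time.monotonic()
        if should_rotate(last_rotate, now):
            new_tor_circuit()
            last_rotate = now
        yield requests.get(url, proxies=TOR_PROXIES, timeout=30).text
```

Note the thread's caveat applies: whether this "blends into background noise" depends on the target's rate limiting, and Tor exit IPs are publicly listed, so many sites block them outright regardless of request rate.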