- cross-posted to:
- technology@beehaw.org
- technology@lemmy.world
- opensource@lemmy.ml
Yep, it hit many Lemmy servers as well, including mine. I had to block multiple Alibaba subnets to get things back to normal. But I’m expecting the next spam wave.
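Subnet blocking of the kind described above can be sketched with Python’s stdlib `ipaddress` module. This is a toy illustration, not how a real server blocks traffic (that’s usually done in the firewall or reverse proxy), and the CIDR ranges below are documentation placeholders, not actual Alibaba allocations:

```python
import ipaddress

# Placeholder CIDR ranges (RFC 5737 test networks, NOT real Alibaba subnets)
BLOCKED_NETS = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def is_blocked(ip: str) -> bool:
    """Return True if the client IP falls inside any blocked subnet."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_NETS)

print(is_blocked("198.51.100.7"))  # inside the first /24
print(is_blocked("192.0.2.1"))     # outside both ranges
```

In production the same check typically lives in nftables/iptables rules or the web server config, so blocked requests never reach the application at all.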
The Linux Mint forums have been knocked offline multiple times over the last few months, to the point where the admins had to block all Chinese and Brazilian IPs for a while.
This is the first I’ve heard about Brazil in this type of cyber attack. Is it re-routed traffic going there or are there a large number of Brazilian bot farms now?
I don’t know why/how, just know that the admins saw the servers were being overwhelmed by traffic from Brazilian IPs and blocked it for a while.
Removed by mod
nepenthe
It’s a Markov-chain-based text generator, which could be difficult to deploy on repos depending on how they’re hosted. Regardless, any sensibly built crawler will have rate limits. This means that although Nepenthe is an interesting thought exercise, it’s only going to do anything to crawlers knocked together by people who haven’t thought about it, not the big companies with the real resources who are likely having the biggest impact.
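The core trick behind a Markov-chain tarpit can be sketched in a few lines of Python (a toy illustration, not Nepenthe’s actual implementation): build a table mapping each word to the words that follow it in some seed text, then walk the table to emit endless statistically plausible filler for crawlers to choke on.

```python
import random
from collections import defaultdict

def build_chain(text: str) -> dict:
    """Map each word to the list of words that follow it in the seed text."""
    words = text.split()
    chain = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain: dict, start: str, length: int, seed: int = 0) -> str:
    """Walk the chain to produce `length` words of plausible-looking garbage."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:  # dead end: restart from a random key
            followers = list(chain)
        out.append(rng.choice(followers))
    return " ".join(out)

seed_text = "the crawler fetched the page and the page linked the crawler onward"
chain = build_chain(seed_text)
print(generate(chain, "the", 12))
```

A real tarpit serves this output slowly over many interlinked pages, so a naive crawler burns time and bandwidth on text that looks like content but isn’t.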
Removed by mod
any way of slowing things down or wasting resources is a gain I guess
ELI5 why the AI companies can’t just clone the git repos and do all the slicing and dicing (running
git blame
etc.) locally instead of running expensive queries on the projects’ servers?

It takes more effort and results in a static snapshot without the ability to track the project’s evolution. (Disclaimer: I don’t work with AI, but I’d bet this is the reason. I don’t intend to defend those scraping twatwaffles in any way, just to offer a possible explanation.)
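For what it’s worth, the local approach the question describes really is straightforward: clone (or mirror) once, then run every history query against the local copy. A minimal Python sketch using `subprocess`, here building a tiny throwaway repo instead of cloning over the network so it stays self-contained:

```python
import pathlib
import subprocess
import tempfile

def run(args, cwd):
    """Run a git command in `cwd` and return its stdout."""
    return subprocess.run(args, cwd=cwd, check=True,
                          capture_output=True, text=True).stdout

# In practice you'd `git clone --mirror <url>` once, then query the local
# copy. Here we fabricate a one-file repo so the blame step is runnable.
with tempfile.TemporaryDirectory() as repo:
    run(["git", "init", "-q"], repo)
    pathlib.Path(repo, "main.py").write_text("print('hello')\n")
    run(["git", "add", "main.py"], repo)
    run(["git", "-c", "user.email=dev@example.com", "-c", "user.name=dev",
         "commit", "-q", "-m", "add main.py"], repo)
    blame = run(["git", "blame", "main.py"], repo)
    print(blame)
```

Every `git blame`, `git log`, etc. after the clone is a purely local operation, which is exactly why hammering the hosting server for the same information is so gratuitous.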
Also having your victim host the costs is an added benefit
deleted by creator
If an AI is detecting bugs, the least it could do is file a pull request, these things are supposed to be master coders right? 🙃
To me, AI is a bit like a bucket of water, if you replace the water with “information”. It’s a tool and it can’t do anything on its own. You could make a program and instruct it to do something, but it would work just as chaotically as when you generate stuff with AI. It annoys me so much to see so many people (people in general) think that what they call AI is in any way capable of independent action. It just does what you tell it to do, based on how it has been trained, which is also why relying on an AI trained by someone you shouldn’t trust is a bad idea.
Assuming we could build a new internet from the ground up, what would be the solution? IPFS for load-balancing?
There is no technical solution that will stop corporations with deep pockets in a capitalist society
Maybe letters through the mail to receive posts.
so basically what you are saying is to not put information in public places, but only send information to specific people
And only then because the USPS is a federal agency. You can bet if private corporations ran it there would be no such privacy.
How long will USPS last?
Removed by mod
AI will come up there to abuse it as well
They’re afraid