A few quick ones:
- consider setting up keyword interceptors, for example using the Moderation Query String.
- turning on email validation to give a little more control over account creation.
- regularly searching for terms that spammers are currently posting about, such as "live stream", "Abercrombie and Hollister", "Gucci" etc., and sending existing content to moderation (and then deletion - or, if you are unlucky, the moderator pushes the wrong button and it comes back).
- IP blocking if the same spammer keeps coming after you.
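The "search for current spam terms" step can be partly scripted against a local export of the content. A minimal sketch, assuming you already have the messages as files on disk - the directory layout and the term list are both assumptions to adjust for your own site:

```shell
#!/bin/sh
# Sketch: grep a local export of community content for current spam terms,
# listing the files/messages that should go to moderation.
# The term list is illustrative - update it as the fashion changes.
TERMS='live stream|Abercrombie|Hollister|Gucci'

find_spam_terms() {
  # -i: case-insensitive, -l: only list matching files, -E: extended regex
  grep -ilE "$TERMS" "$@"
}
```

Usage would be something like `find_spam_terms export/*.html`, then review the listed files before sending them to moderation.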
But it sometimes seems like a full time job. Resist the temptation to comment on or even think about why so much of it seems to come from a particular area of the globe - that would be so un-PC (yes - "Barry" in Australia I'm thinking of you).
If the spammer puts a visually enticing picture of "themselves" on their profile - they probably really are that nice (?)
Full time job is right. But frankly, I don't think it should be. We don't have the resources to do it without taking away from other programs. And it's not as if spam is a new issue for the web or one that other platforms (LinkedIn groups, WordPress blogs, news sites, etc) haven't dealt with. Why hasn't Jive?
Currently one way to get rid of spam is to download all new messages or modified content on a regular basis and scan for external URLs, check the URLs, disable the users and delete the links.
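That scan can be sketched in a few lines of shell, assuming the messages have already been downloaded as HTML/text files; "mycommunity.example" below is a placeholder whitelist, not a real setting:

```shell
# Sketch: pull every external URL out of downloaded messages and drop the
# community's own domains, leaving only the suspicious links to review.
# Adjust WHITELIST to your own domain(s); the value here is a placeholder.
WHITELIST='mycommunity\.example'

list_external_links() {
  # extract http(s) URLs, then remove whitelisted (internal) domains
  grep -hoE 'https?://[^"<> ]+' "$@" |
    grep -vE "https?://([^/]*\.)?($WHITELIST)"
}
```

Anything that survives the whitelist filter is a candidate for checking, disabling the user and deleting the link.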
I have never seen text-only spam without links or external images.
We have users who register and then, some weeks later, create their first message with "useful" content that also includes a spam link. These are really hard to find.
We do not delete the users; they are disabled and will need a new email address for their next spam run. It would be nice to have an option to add a comment explaining why a user was disabled.
As long as search engines follow links to external pages we will have this problem. You may also have noticed that SBS itself needs search engine optimization. Option a) or b) below would help a lot to get rid of spam that is intended for the SEO of other domains.
Some thoughts / options:
a) Modify the links to <a href="http://www.example.com/" rel="nofollow">Link text</a>. This should be done server-side after submitting (Add Reply). Unfortunately this will also prevent search engines from following links to useful pages ...
b) Or add the nofollow meta tag to the thread, message, blog and document templates. Then only the first page of threads with a lot of replies (the page holding the link to the second page) would be indexed by public search engines. I don't like this option.
c) Require moderation for every message with external links / images. A nice additional editor feature would be an option to replace all domains with "example.com" so a message can be published without moderation.
d) The "Admin Plugin" should show all messages and registrations with external links or content for a specific time range (e.g. the last day, or on a Monday the last three days).
I like a) as it does not require manual interaction. And d) would be a great option even if one does not really need to use it often.
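Option a) could be prototyped with a couple of sed passes. This is only an illustration of the rewrite - in SBS it would have to live in a server-side filter on submit, not in a shell pipeline - and "mycommunity.example" stands in for the community's own domain:

```shell
# Sketch of option a): add rel="nofollow" to anchors pointing off-site.
# Pass 1 tags every absolute link; pass 2 removes the tag again from links
# to the community's own (placeholder) domain. Relative links are untouched.
add_nofollow() {
  sed -E 's|<a href="(https?://[^"]*)"|<a href="\1" rel="nofollow"|g' |
    sed -E 's|(<a href="https?://([^/"]*\.)?mycommunity\.example[^"]*") rel="nofollow"|\1|g'
}
```

The two-pass approach keeps the regexes simple: it is easier to un-tag the known internal domain than to write one expression that matches "every domain except ours".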
Unfortunately there's no easy way to do this automatically: spam words change with trends and fashion, and we can't always rely on the community to police itself (users have become jaded and, as much as they get annoyed, don't report as much as they should).
No matter what, it takes a small time investment to monitor and keep track of spam terms.
- You can add any identified words and URLs to a blacklist so that they can't be used in the community. You have to be careful not to restrict too much, because some terms might still be legitimate (e.g. if one of your customers is a "casino" and has that term in their name, like we have).
- I've set up Google Alerts to notify me when any blacklist words/URLs are used in a post on our community.
- The same goes for any code that can hide text, such as font size 0 or inline CSS that makes text invisible.
- Similarly, perhaps site search can be used too: set up synonym lists for groups of spam words (gambling, drugs, counterfeit goods) so that searching for any one term in a group retrieves results for all of its synonyms. While not automatic, performing site searches like this every 2 or 3 days will retrieve all indexed results faster and more completely than Google Alerts might.
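The hidden-text trick can also be scanned for with simple patterns over downloaded content. A sketch - the patterns are illustrative and certainly not exhaustive:

```shell
# Sketch: list posts that hide text via inline CSS (font-size 0,
# display:none, visibility:hidden) - a common way to smuggle spam keywords.
# Real spam uses many more variants; extend the pattern list as needed.
HIDDEN='font-size[[:space:]]*:[[:space:]]*0|display[[:space:]]*:[[:space:]]*none|visibility[[:space:]]*:[[:space:]]*hidden'

find_hidden_text() {
  # -i: case-insensitive, -l: only list matching files
  grep -ilE "$HIDDEN" "$@"
}
```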
I have attached the bash/shell script we use to detect spam. One needs to adjust the URL, the whitelist and probably the output directory. One can then use a browser to review the messages with external links.
Drop the Report Spam threshold to 1 instead of 3.
While that is great for spam removal, a user can abuse it to hide legitimate threads.
spam4sbs.sh 2.0 KB
Re dropping the spam reporting threshold - on JC it has made it sooo much easier to report - I know how complicated it used to be when the threshold was 3.
Re users: it would be interesting to know approximately how many abuse reports are rejected on JC? Presumably a persistent reporting abuser would be warned and then disabled in the same way we treat a spammer.
There seems to be much less spam on JC nowadays - presumably because Ryan has set up interceptors, or is just doing it manually.
I set the Keyword Interceptor / Moderation Query String to "http" OR "https" today. Internal links still work fine (all of them? I only ran simple tests); external links require moderation.
It's not perfect, as it also affects our regular users.
Check the external bookmarks Ryan, there's a lot of spam.
I blocked .../bookmarks from crawlers...it will still be there but hopefully ignored.
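Blocking a path from crawlers is done in robots.txt. A minimal sketch - the /bookmarks path below is only a stand-in, since the real path prefix depends on the site layout:

```
# robots.txt - the path is a placeholder, not the site's actual one
User-agent: *
Disallow: /bookmarks
```

Note that Disallow only stops crawling; pages that are already indexed can linger in search results for a while, so "hopefully ignored" is about right.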