The Internet giant also announced new tools to make life easier for community managers on the social network.
Groups are one of Facebook's most important tools precisely because they create communities where people can interact around shared tastes or interests and strengthen their relationships online.
In short, groups give Facebook its true "social network" status. For this reason, it is of utmost importance to the 'Internet giant' to keep this tool working well and, of course, to address the 'flaws' that could currently be considered problematic in its operation.
To that end, Facebook announced a series of changes to support community creators, better known as "administrators," so that they remain in control of their groups and have all the options they need to moderate their communities effectively.
New home page for administrators
Just as it has created dedicated 'feeds' or home pages for other tools in the past (such as Business Suite for Fan Pages), Facebook decided to build a new home page where group administrators can find functions and settings designed specifically for their role within the community.
With this new experience, administrators will be able to see, organized in one place, the items that could be considered urgent within their group: the number of members, new participants, new posts, pending post requests, and comments that have been reported for violating community standards.
In addition, this 'feed' features a new layout in which organization and clarity stand out.
A conflict moderator for groups
One of the biggest problems every administrator faces is how easily the delicate line between peace and conflict is crossed in some posts.
Although the members of a virtual community presumably join it to find people who share their tastes or interests, this does not mean that all of them think exactly alike about every topic in life.
For example, in a group united by passion for a specific soccer team, not all members need to think alike about their country's politics. They are not robots and, as expected, each one is a universe of their own.
Given this, it is clear that an administrator cannot control, or even try to control, how all the members of their group think. However, they can intervene to minimize friction between 'group mates' and nip in the bud any argument that breaks out in the comments section of a post.
For this, Facebook created Conflict Alerts, a tool based on Artificial Intelligence that will use machine learning to detect, flag, and help defuse heated discussions within a group.
According to Facebook, this new "moderator" is a feature that "notifies administrators when there might be controversial or unhealthy conversations in the group, so they can take action on it."
Facebook users can fool this type of system by using colloquial language or symbols in place of letters. Still, it is precisely for such cases that Facebook is working so that, in the future, the "virtual moderator" can pick up on "obvious clues" and quickly and easily detect this kind of conversation in any post.