Regulating mental health content on social networks

We can find "our people" on social networks, but not if mental health content is over-regulated.

Instagram is hiding posts tagged #depression to "protect its community from content that may encourage behaviour that could cause harm or even lead to death". Mark Brown assesses the unintended consequences of emerging social network policies.

Over the past decade and a half, social networking sites such as Twitter and Facebook have been ways for people with mental health problems to widen their social circles, meet others with similar experiences and find a new collective voice. New proposals for regulating social networks, in the European Union as well as in America, run the risk of destroying this space where people speak frankly, with grace, humour and kindness, without needing to hide the darker and more disturbing aspects of life with mental health problems.

"What makes social network content so valuable to people who actually live with mental health problems can be what makes it so disturbing to others who do not live with the same challenges. If the term 'depression' cannot be searched for, how will you find other people who have similar experiences?"

In the United Kingdom, Matt Hancock, Secretary of State for Health and Social Care, wrote to social media companies warning them to remove suicide and self-harm related material. "It is appalling how easy it still is to access this content online," he wrote, "and I have no doubt about the harm this material can cause, especially for young people. It is time for internet and social media providers to step up and purge this content once and for all."

In April, the UK government published the Online Harms White Paper, a policy document setting out plans to reduce the potential of social media platforms to cause wider social harms. It proposes many sensible changes that would prevent social networks being used to promote disinformation, criminal activity and harassment. It defines "encouraging or assisting suicide" as a harm with a clear definition, and "advocacy of self-harm" as a harm with a "less clear definition". The White Paper proposes a new regulatory framework; a duty of care to users; "civil fines for proven failures in clearly defined circumstances"; and the possibility of senior management liability that "could involve personal liability for civil fines, or could even extend to criminal liability". The important thing to note is that the proposals in the White Paper aim to limit the spread of content promoting self-harm and suicide, not to help the individuals who create or share it.

Looking for solidarity or support

One of the transformative elements of social networks for people who experience mental health difficulties has been the possibility of finding others who have had similar experiences. For this to happen, content must be shared in public. What makes social network content so valuable to people who actually live with mental health problems can be what makes it so disturbing to others who do not live with the same challenges. What we discuss with our peers on social networks, and how we discuss it, depends to a great extent on context for its meaning. One person's plea for support, or sharing of what they are feeling, can be another's promotion of unhealthy ideas, or even encouragement to replicate them.

It is tempting to imagine a room full of people reading, watching or listening to everything that appears on social networks. Every day, social network platforms such as Twitter, Instagram, YouTube and Facebook publish more content than one person could review in a lifetime, or hundreds of lifetimes. As individuals, our response to any material online is guided by our knowledge of the intentions behind a post and our ability to read them. For social network platforms that want to maximise users and profits, and to avoid prosecution and fines, everything is just content.

The sheer volume of published material means that social media platforms have a limited number of options for moderating content. Nuance and scale do not mix. A crackdown on proscribed content will probably combine strengthened user-reporting measures with structural or technical changes.

The biggest and quickest structural change a platform can make is to alter its terms of service, such as including a clause stating that "users may not promote, or be deemed to promote, suicide or self-harm", justifying whatever action the platform later takes in response. Structural changes can also mean altering how a platform works, or how it responds to certain forms of content. Platforms aimed at young children sometimes make it impossible to type certain words in messages. Other platforms have experimented with blocking external links to certain domains.
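To make those two mechanisms concrete, here is a minimal sketch in Python of what word blocking and link blocking might look like; the blocklists, names and examples are hypothetical illustrations, not any platform's actual implementation:

```python
# Hypothetical sketch of two structural filters: blocking certain words in
# messages, and blocking links to certain external domains.
import re
from urllib.parse import urlparse

BLOCKED_WORDS = {"badword", "otherbadword"}    # hypothetical word blocklist
BLOCKED_DOMAINS = {"blocked-site.example"}     # hypothetical domain blocklist

def message_allowed(text: str) -> bool:
    """Reject a message outright if it contains any blocked word."""
    words = {w.lower() for w in re.findall(r"[\w']+", text)}
    return not (words & BLOCKED_WORDS)

def link_allowed(url: str) -> bool:
    """Reject a post that links out to a blocked domain (or subdomain)."""
    host = (urlparse(url).hostname or "").lower()
    return host not in BLOCKED_DOMAINS and not any(
        host.endswith("." + d) for d in BLOCKED_DOMAINS
    )

print(message_allowed("hello there"))                     # True: nothing blocked
print(link_allowed("https://blocked-site.example/page"))  # False: domain blocked
```

Note how blunt the mechanism is: it acts before anything is published, with no sense of who is writing or why.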

Facebook has carried out many experiments in altering which content its news feed actually shows users, most notably in 2014, when it was revealed it had been experimenting to see whether what it showed users changed their emotions. YouTube currently faces criticism for the way its recommendations suggest videos of young children to users who have been watching sexually themed content. Elsewhere, Instagram is "hiding" posts tagged #depression. Searching for this hashtag shows the message "We have hidden posts for #depression to protect our community from content that may encourage behaviour that could cause harm or even lead to death", followed by a "Get support" link.

Other solutions available to social network platforms rely on mechanisms for "detecting" material deemed problematic, then taking steps to remove it, or the accounts that posted or shared it. Detection based on words alone is very difficult to get right. In 2014, the Samaritans' Radar app used keyword searches on the tweets of people a user followed on Twitter, then notified that user so they could step in and offer help. It was withdrawn after a month, following protests over privacy and surveillance, many of them from people with mental health problems.
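A toy example makes it clear why word-based detection misfires. The sketch below, in Python with a hypothetical keyword list and invented posts, flags two harmless posts and misses a genuine plea for support, because a keyword match cannot see context:

```python
# Hypothetical naive keyword detector; the list and posts are invented.
RISK_KEYWORDS = {"suicide", "self-harm", "kill"}

def flag_post(text: str) -> bool:
    """Flag any post containing a risk keyword, with no notion of context."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in RISK_KEYWORDS)

posts = [
    "this deadline is going to kill me lol",          # joke
    "new report on suicide prevention services",      # news
    "reaching out for support with how I'm feeling",  # genuine plea
]

for post in posts:
    print(flag_post(post), "->", post)
# Prints True, True, False: both harmless posts are flagged, while the
# one post that actually asks for support goes undetected.
```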

At present, Twitter offers the option to report an individual tweet as "abusive or harmful". A follow-up page lets users select the option "This person is encouraging or contemplating suicide or self-harm". Users then specify that the person who sent the tweet is "potentially at risk", and that person is sent an automatically generated email directing them to the Samaritans. A person can receive many emails of this kind if, in the eyes of others, they seem to be at risk. In 2017, Facebook began the international rollout of AI-assisted content moderation to detect suicidal ideation. This flags suspect content for action by human moderators. Outside the EU, this can be escalated as far as calling in authorities such as the police.
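The pipeline described, automated flagging followed by human review that can escalate, has a simple shape, sketched below in Python. The threshold, the scoring function and the routing are hypothetical stand-ins, not Facebook's actual system:

```python
# Hypothetical flag-and-escalate pipeline: a model scores content, humans
# review anything over a threshold, and only the human step can escalate.
from dataclasses import dataclass

REVIEW_THRESHOLD = 0.6  # hypothetical score above which humans review a post

@dataclass
class Post:
    author: str
    text: str

def risk_score(post: Post) -> float:
    """Stand-in for a trained classifier's estimated probability of suicidal
    ideation; returns a fixed score here so the pipeline can be exercised."""
    return 0.7

def moderate(post: Post) -> str:
    """Ignore low scores; queue high scores for a human moderator, who may
    send support resources or, outside the EU, contact authorities."""
    if risk_score(post) < REVIEW_THRESHOLD:
        return "no action"
    return "queued for human review"

print(moderate(Post("someone", "an example post")))  # queued for human review
```

Even in this shape, every judgement about meaning is deferred to whoever sits at the end of the queue, and the volume of flagged material decides how much attention each post gets.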

Social media platforms contain public material going back years. The best guide we have to what a large-scale response to a mass of historical content on a social network platform might look like was Tumblr's, prompted by a change in US legislation. In March 2018, the US Congress passed the Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act (FOSTA/SESTA), a statute granting prosecutors greater powers to fight sex trafficking. The act made social network platforms legally liable for the actions of their users. According to WIRED magazine, Tumblr, home to a great deal of sex-related content, responded by using AI to flag and remove "photos, videos, or GIFs that show real-life human genitals or female-presenting nipples, and any content, including photos, videos, GIFs and illustrations, that depicts sex acts". This led to many wrong calls: "Classic paintings of Jesus Christ were flagged, as were photos and GIFs of fully clothed people, footwear, line drawings of landscapes, discussions of LGBTQ+ topics, and more."

The greatest risk for people with mental health difficulties is feeling forced back into the shadows for fear of falling foul of rules about what we can say in public. Although social network platforms feel like public spaces, they are private organisations that depend on the public's use of them for their income. At present, the big platforms seem reluctant to limit still further the freedom of people with mental health difficulties to discuss and share, but this is part of the same stance that refuses to implement additional controls over what people publish and share.

The danger of the proposals in the Online Harms White Paper is that companies may take decisions that erode or remove the confidence of people with mental health difficulties to share freely, and their ability to search for and find others with similar experiences. If the term "depression" cannot be searched for, how will you find other people who have similar experiences? People with mental health difficulties are already in a position where their posts can be hidden, reported or flagged by others who feel uncomfortable with what they share or disclose. We can grasp nuance when we look at individual posts. Any solution that claims to deal with millions of posts will either lack nuance or be expensive. The easiest answer to this material is to remove it and hide it as quickly and efficiently as possible, which means banning the users who post content deemed unacceptable, or deleting the content itself. A solution that is very clever at scale still has the potential to be very stupid in practice.

Given a sufficient legal threat of liability, it is possible that a "purge" of the kind Matt Hancock suggests could be implemented. People who live day to day with mental health problems are not a powerful lobby and, ironically, it has often been on social networks that our voices have been strongest and most authentic. Where the proposals see risks to crack down on, we have found opportunities.

 
