Review of 2018 – The hottest topics in the legal aspects of Internet and social media use

Perhaps it is no exaggeration to say that 2018 was the most significant year so far in the love-hate relationship between the Internet and the law. And it is not an excessive prediction that 2019 will be even more intense. At the end of the year I reviewed the most important news of 2018 related to the subject of this website. At the top of the list there must definitely be the Cambridge Analytica case, which I am planning to expound in a separate post. While that case had an outstanding effect on the public judgement of Facebook, the historic hacker attacks on the site could also make users of the most popular social media site unsure whether their personal data are in good hands with Mr. Zuckerberg and co. In the following sections I would like to introduce these news items, without aiming to give a complete picture of every single legal incident that happened on the world wide web. I appreciate any additions in the comment section. The list is in no chronological or other particular order; the only organizing principle is to present as many aspects as possible.


(Source: Getty Images Hungary)

Facebook friends are not real friends:

The Supreme Court of the state of Florida ruled that Facebook friendships are not necessarily real friendships. On 15 November, the court expressed its views on the Facebook connections of judges and lawyers: in its opinion, a judge is not presumed to be biased merely by adjudicating a case in which one of the lawyers is his or her Facebook friend. The ruling responded to an appeal in which one side asked for the judge's disqualification, assuming that such a Facebook friendship implied bias. The court also noted that not even a traditional, offline friendship would necessarily be sufficient reason to disqualify a judge, because the nature of "friendship" is not precisely defined.

The majority opinion states: The establishment of a Facebook “friendship” does not objectively signal the existence of the affection and esteem involved in a traditional “friendship.” Today it is commonly understood that Facebook “friendship” exists on an even broader spectrum than traditional “friendship.” Traditional “friendship” varies in degree from greatest intimacy to casual acquaintance; Facebook “friendship” varies in degree from greatest intimacy to “virtual stranger” or “complete stranger.”

The legal question in this decision does not seem particularly significant, but it is interesting to see such an accurate statement about the impersonal nature of online relationships.

Anti-fake news law in France:

In November 2018 the French National Assembly adopted two laws aimed at preventing information manipulation, i.e. the dissemination of false news during election periods.

Under the law, social media websites must be transparent about paid content they publish. Candidates standing in elections will be able to go to court to stop the distribution of false news in the three months preceding the election.

According to the law, during the electoral period the French media authority may even suspend the broadcasting rights of television channels controlled or influenced by a foreign state that have deliberately disseminated false news capable of influencing the vote.

President Emmanuel Macron announced earlier this year that he intends to review French media legislation. Macron was clearly referring to the Russian state media, the RT news channel and the Sputnik news site, which he classified as propaganda outlets.

The law has drawn many attacks from political competitors and lawyers, as the French government has been accused by both right- and left-wing opponents of trying to create a form of "thought police."

In April 2018 Malaysia's parliament also passed a bill on fake news. The law defines fake news quite broadly, as "any news, information, data and reports, which is or are wholly or partly false, whether in the form of features, visuals or audio recordings or in any other form capable of suggesting words or ideas".

Facebook as a tool in genocide

According to an article published in The New York Times in October 2018, the Myanmar military had for years maintained a special unit for stoking anger and hatred against the country's Muslim minority by spreading false news on Facebook. This is the first known example of a dictatorial power turning rumors, trolls and propaganda distributed on Facebook against its own population. According to the newspaper's Myanmar sources, who asked not to be named, the operation involved hundreds of personnel, who ran troll accounts and fan pages to flood Facebook with hateful content and commentary. Myanmar's 18 million Internet users largely identify the web with Facebook itself; in other words, the Internet there mostly means Facebook.

Facebook confirmed many of the details about the shadowy, military-driven campaign. The company’s head of cybersecurity policy, Nathaniel Gleicher, said it had found “clear and deliberate attempts to covertly spread propaganda that were directly linked to the Myanmar military.”

In August, after months of reports about anti-Rohingya propaganda on Facebook, the social media company acknowledged that it had been too slow to act in Myanmar. By then, more than 700,000 Rohingya had fled the country in a year, in what United Nations officials called “a textbook example of ethnic cleansing.” The company has said it is bolstering its efforts to stop such abuses.

Fake viewing data disclosed by Facebook

Another unpleasant case connected to the most popular social media site: Mark Zuckerberg's empire has been accused of knowingly misleading some of its partners for months about the viewing data of videos posted on Facebook.

The problem was that the video-viewing metrics system reported erroneous numbers. Facebook eventually noticed and corrected the error, but according to the claimants, who are suing the company, Facebook became aware of the problem months before correcting the calculation, and the figures were only put right after more than a year. According to the media companies bringing the case, the inflated numbers allowed Facebook to present its own platform in a (much) better light to potential partners than YouTube (Google).

By Facebook's own calculation, the erroneous data differed from the real figures by 60-80%, though the company stressed that its partners suffered no damage. Other professional sources put the error at 150-900 percent and claimed the firm had known about the problem from the very beginning. This matters because such a difference can be enough for a brand looking for the largest video distributor to choose Facebook's solution instead of YouTube to introduce a product to the community. The applicants therefore accused Facebook of unfair business conduct and fraud. Facebook's video-view data has since been audited by an invited external company rather than an in-house system.

A mobile app to prevent suicide

A new method for working with depressed patients via a mobile phone has been developed. Based on the collected data, the method appears able to predict one day in advance whether a patient may become suicidal, although Matthew Nock, Professor of Psychology at Harvard University, has not yet disclosed the exact details of the research. A recent article introduced the new application.

For the development of the program, Nock collected and analyzed the data of hundreds of suicidal patients; hundreds of university students also participated in the program and were monitored for weeks by a sensor developed by Rosalind Picard, an electrical engineer and computer scientist at the Massachusetts Institute of Technology (MIT) in Cambridge.

Based on the data, it seems that the new device not only does a good job of detecting mental illness, but can also predict one day in advance whether the wearer's condition is deteriorating, and can thus be of invaluable help in preventing suicide. Picard began developing the sensor for a completely different, though also health-related, purpose: in 2013 she started work on an observation tool to predict epileptic seizures, which is how the engineer, who had previously struggled with depression herself, arrived at the idea that a similar monitoring tool could be created for this problem.

Many applications have been made for similar purposes before. There are more than ten thousand of them, so it is not surprising that they differ greatly in quality. Among the more serious ones are those developed specifically for veteran soldiers: Operation Reach Out, for example, was made for former soldiers suffering from depression and post-traumatic stress disorder.

A 13-year career ends because of Facebook posts

An instructive legal case from Hungary that has not yet ended with a court verdict. A prosecutor with a 13-year career and a number of professional awards received three phone calls the day after the April 2018 parliamentary election in Hungary, demanding that he be disciplined for his pre-election Facebook posts. In the end, his employer terminated his contract. The prosecutor challenged the decision in court. The subject of the lawsuit is, among other things, whether the posts exceeded the bounds of freedom of speech, but above all, how proportionate the punishment was.

The prosecutor posted a picture with a caption on his personal Facebook profile and later, still before the election, shared an article with his own comments. The picture, depicting a flock of sheep, was critical of the party that went on to win the election and its supporters. The caption encouraged his Facebook friends to join forces against the ruling party in the election, irrespective of party affiliation. The shared article was about a growing scandal involving a member of the governing party, and the prosecutor's comments suggested that, based on the article's claims, the person in question was guilty.

The proceedings are still in progress at the Capital Labour Court in Budapest.

First fine for Facebook for the Cambridge Analytica case

The Cambridge Analytica case definitely deserves its own post, which I am planning to write in the near future, as it is the subject of my latest publication. In this article I would like to introduce the first real legal reaction to the case, which changed many aspects of how we think about social media in 2018.

Facebook received a fine of £500,000 in Britain for having released users' data to third parties without the users' consent. This was the so-called Cambridge Analytica scandal, named after the company that acquired the user data. Cambridge Analytica also contributed to Donald Trump's campaign, making it easier to design Trump's campaign messages and target them at the right audience on Facebook. The £500,000 is the highest fine allowed by the old UK data protection law in force before the GDPR. According to the British data protection authority, Cambridge Analytica was able to obtain the data of over 1 million British citizens, without their consent, through an external application developer.

It is Facebook's luck that the GDPR was not applicable at the time of the infringement, because it is clear that a fine for such a scandal could come close to the highest the GDPR allows: 20,000,000 euros, or 4% of annual worldwide turnover, whichever is higher.