1 Introduction

In recent years, there has been a lot of news about privacy and security breaches on different online social network (OSN) platforms. The most notorious was the case of Cambridge Analytica, which affected millions of Facebook users whose data was harvested without their consent and used for political purposes. This led to a loss of trust among users, while others decided to leave the platform altogether and/or move to an alternative platform [1].

Previous research has shown that people find it important to have control over who sees their personal information on Facebook but are more likely to disclose information on Facebook than in general [2]. Higher levels of perceived privacy and control over information flow increase trust in Facebook [3]. In addition, higher trust in Facebook leads to more information sharing, whereas awareness of the possible consequences of sharing information on Facebook leads to more protective self-disclosure behavior [4].

Facebook is a social media platform used by 79% of Estonians [5], but in recent years Facebook has faced a lot of criticism over how it collects and handles users’ data. This led to Facebook admitting failure in providing secure data storage and promising to make changes in how it collects and utilizes user data [6, 7]. These privacy concerns negatively affect the trust of Facebook users; however, trust is essential for any online social network to function. Decreased trust in Facebook has resulted in movements like #DeleteFacebook and has given competing social media platforms a chance to gain new users [8].

The purpose of this study is to understand Estonian Facebook users’ usage behavior and perception of trust in Facebook, and based on that to design a trust reassurance mechanism as a proof of concept. The results of the study confirm that trust in Facebook among Estonians is very low and that users would like to have more control over their data. Experts designed a solution that would provide Facebook users with tools to control and manage their personal information. Users would have the opportunity to obtain more information about what data is being collected about them, how it is used, and who has access to it.

The remainder of this paper is organized as follows. We first discuss the related work on privacy, control over data, and trust in Sect. 2. Next, we give an overview of the research methodology and results in Sect. 3. Finally, we summarize our findings and conclude in Sect. 4.

2 Privacy, Control and Trust

Privacy is an individual’s right to determine how, when, and to what extent information about them will be disclosed to another person or to an organization [9]. As we are creating more data than ever before [10], privacy and data protection have become increasingly important. With the increased amount of data comes the risk of malevolent attacks. Since 2005, 9,071 data breaches have been made public [11], but the actual number may be several times higher. The scale of the breaches varies greatly, as both individuals and major corporations with their vast pools of data can be targeted.

The number of breach incidents in social media in 2018 was six; however, the number of records breached was 2.56 billion, which accounted for more than half of all the records breached in the first half of 2018 [12]. While the number of incidents has declined, the number of records that get compromised has increased. A few examples of the largest social media privacy breaches:

  • In 2012, 171 million VKontakte users’ accounts were obtained by a hacker [13]

  • In 2018, personal profile data of 52.5 million Google+ users was exposed [14]

  • In 2018, two major scandals hit Facebook: in the first half of the year up to 87 million users’ personal data was shared with Cambridge Analytica [15] and in the second half of the year, hackers gained access to 14 million users’ highly sensitive data that could be used to facilitate identity theft [16].

2.1 Control over Information

To be able to enact one’s privacy preferences, the user needs to have control over who can access the information they have disclosed. Recent research has shown that, given the choice, users prefer to share less information rather than more [17], and this decision reflects users’ privacy concerns. Some research suggests that rather than defining privacy through the disclosure of certain types of information, it should focus on having control over who knows what about the user [18] and having “personal control over the collection, use, and disclosure of one’s personally identifiable info” [19].

As personal information disclosed on social networks can reveal a lot about users’ identities, the degree of control over that information should be as high as possible. For that, systems should support the user’s ability to control, edit, manage, and delete information about them and decide when, how, and to what extent that information is communicated to others [20]. Control over the access of personal data allows users to “stipulate the information they project and who can get hold of it” [21].

The more control the user perceives to have over who has access to their information, the less concerned they are about privacy [22]. Therefore, to lessen privacy concerns, more control functions should be offered to users - both over the disclosed information and over who has access to it. However, privacy-related behaviors are more likely to be influenced by users’ concerns about the amount of information being accessed rather than by disclosing it themselves [22].

In the case of Facebook, users can currently control which other users see their disclosed information; however, regarding third-party applications on Facebook, the options are more limited. Some applications require information about the user (e.g. email address, date of birth, name, and profile picture) to be accessible, which might lead to lower perceived control. Simply informing users about what information is being shared with third-party applications, without changing the amount of disclosed information, can increase the level of perceived control [23].

2.2 Trust

Trust in platforms depends on factors such as users’ personality, knowledge based on users’ prior experiences, institutional assurances from providers, calculative assurances from providers, and cognitive assurances from third parties [24, 25]. Effective privacy practices are considered essential to create and maintain users’ trust [19]. A common way to positively influence users’ trust is enforcing privacy policies, which demonstrate to users that a platform is competent and willing to secure any disclosed personal information. The more users trust the platform, the more loyal they are, shown by “increased purchases, openness to trying new products, and willingness to participate in programs that use additional personal information” [26].

According to Mayer [27], the extent to which an organisation, in this case Facebook, is trusted depends on perceived trustworthiness, which consists of three factors: ability, benevolence, and integrity. Ability means that the organisation or service provider has the domain-specific skills and knowledge to be taken seriously. Benevolence is the degree to which a platform is concerned about its users and not just profits. Integrity is subjective - whether and to what extent the platform abides by ethical and moral principles.

After the Cambridge Analytica scandal, trust in Facebook dropped 66%, with only 28% of users believing that Facebook is committed to protecting the privacy of users’ personal information [1]. The reason for the lack of trust is that users are concerned about whether and how their personal information is being kept private. For a user to trust a platform, there has to be a motivational incentive, which can be affected by propensity to trust, perception of risk, benefits of engaging in the relationship, and the availability of other options that may achieve similar results [28].

Without trust, users will not interact with each other, and will stop disclosing and diffusing information that is essential for any social network to operate. To trust a certain OSN, users need to trust the other users of that platform [29]. As user profiles are expected to correspond to people known in the real world, a profile is attributed the same level of trust as its owner [30]. To facilitate and maintain trust, OSNs need specific processes to not only verify that the person behind a profile is that person, but also to make sure that once the profile is verified, it will continue to be used by that person and not someone else.

3 Research Methodology and Results

This research is composed of two parts: a survey and a design session. The survey was designed based on previous research on user behavior on online social networks, trust in technology, and the methods used to measure it. For the concept design session, six experts were invited to help design an initial concept of a “trust foster mechanism” that would reassure both control and trust when using online social networks. These experts had combined expertise in user experience, user interface and graphic design, and web and application development.

3.1 Survey

A total of 198 responses were obtained and used for data analysis, except when comparing attitudes by gender, as one of the respondents preferred not to reveal their gender. Of the 198 respondents, 154 were women and 43 were men. Data was collected on the following: demographics, usage patterns, information disclosure behavior, trust in Facebook (including risk perception, benevolence, competence, and general trust), awareness of data breaches, patterns and behavior change compared to the previous year, and finally, privacy preferences.

The survey showed that the most popular activities on Facebook are communicating with people, visiting Facebook groups, and reading the news. 71.7% of all respondents communicate with people very often, 31.3% of respondents report visiting Facebook groups often, and 29.8% of respondents read the news often. After the Cambridge Analytica scandal, most of the respondents reported being less active on Facebook: of those who did not say the scandal directly affected their Facebook usage, 62.9% said they were less active than the previous year, and of those whose usage was affected, 61.5% were less active. Surprisingly, 18% of the respondents whose usage was affected are now more active on Facebook.

Looking at information disclosure, men are more inclined to disclose information, with 46.5% of male respondents having high disclosure levels. Women, on the other hand, are more private with their information, with 45.8% of them disclosing little information. Overall trust perception towards Facebook among respondents was rather low - 86.4% of respondents have either very poor or below-par trust in Facebook. Over half of the respondents had very poor trust perception towards Facebook, regardless of their usage patterns or privacy and disclosure behavior.

The higher the level of trust, the more time people spend on Facebook. 8.3% of the people who spend more than four hours a day on Facebook perceive good levels of trust in the platform, while only 2.9% of those who spend less than an hour there have the same level of trust in Facebook. The same tendencies apply to using Facebook’s different possibilities, functionalities, and features: people with low or medium activity have lower levels of trust compared to those with high activity.

3.2 Concept Design

The aim of this research step was to develop a paper prototype as a proof of concept by a group of design experts. To achieve this, the data gathered from the survey about Estonians’ Facebook usage was combined with the literature review results and presented to the experts. This way, the experts had sufficient information to understand the character of the user in more detail. The design session consisted of five stages: debriefing the survey results, defining user needs and articulating the problem statement, drafting the design concept vision, agreeing on a specific vision that reassures users’ trust, and finally, paper prototyping to consolidate the problem solutions.

One bottleneck identified by the experts was the public image and general reputation of Facebook. The fact that Facebook has been sharing user data without consent is widely acknowledged, and people are more aware of the risks. To rebuild trust, Facebook should be more transparent in different processes - e.g. design, engineering, and finances. One aspect of trust is reciprocity: users should gain something extra besides features for sharing their data. Another important topic is the general trustworthiness of Facebook and its content. Users need to be reassured that the information they see on Facebook is not dubious, and that if something can be considered ambiguous, it will not be spread.

Combining different possibilities that arose from the design concept drafting stage, the experts decided that a dashboard presenting the information disclosed by the user and the data collected about user behavior would be the best solution. The prevailing concept was to visualise as much as possible in a simple way, so that the user would have an overview just by seeing the dashboard. One way to do this is by using different data visualization techniques to draw attention to the most essential information: the data collected and shared with third-party service providers. The user should also be provided with different filters to break the data down by period, type, and the parties that have access to it.

In addition to visualizing the data, the dashboard should include a short report about the information presented in the graphs and a possibility to further review the specifics of the shared data. This means specifying when and how the data was collected, who has access to it, when that access was granted, how the data is being used by anyone who has access to it, etc. For users to comprehend the vast amounts of data, it should be categorised, giving the user easy access to the specific data they are interested in. It is important not to overwhelm the user but, on the contrary, to provide them with a simple, comprehensive overview of the data they want to review.
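As a purely illustrative sketch (not part of the prototype produced in the design session), the filter-based breakdown described above could be modeled roughly as follows. All record fields, names, and sample values here are hypothetical assumptions introduced only to make the idea concrete.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record of one piece of disclosed or collected data.
# The fields mirror the filters the experts proposed: period (when it
# was collected), type/category, and the party that has access to it.
@dataclass
class DisclosureRecord:
    collected_on: date
    category: str        # e.g. "profile", "behavioral", "location"
    shared_with: str     # third party that was granted access

def filter_records(records, start=None, end=None, category=None, party=None):
    """Break disclosure records down by period, type, and accessing party."""
    result = []
    for r in records:
        if start and r.collected_on < start:
            continue
        if end and r.collected_on > end:
            continue
        if category and r.category != category:
            continue
        if party and r.shared_with != party:
            continue
        result.append(r)
    return result

# Example query: which behavioral data did a (fictional) partner
# "AdPartnerX" receive since the start of 2018?
records = [
    DisclosureRecord(date(2018, 3, 1), "behavioral", "AdPartnerX"),
    DisclosureRecord(date(2018, 5, 9), "profile", "QuizApp"),
    DisclosureRecord(date(2017, 11, 2), "behavioral", "AdPartnerX"),
]
hits = filter_records(records, start=date(2018, 1, 1),
                      category="behavioral", party="AdPartnerX")
print(len(hits))  # → 1
```

A real dashboard would aggregate such filtered records into the visual summaries and per-category drill-downs described above; the point of the sketch is only that every filter the experts named maps naturally onto an attribute of a disclosure record.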

4 Conclusion

In the past seven years, six big social networks have encountered massive privacy breaches, which have had an impact on users’ perception of trust and also on their usage. The more control users have over their information and data, the more they can trust the social network. Trust, in turn, determines whether users will remain loyal to a social network and keep using it. There are several frameworks that help designers create services that are perceived as trustworthy, but Facebook has faced criticism about its design approaches. This can be addressed by designing ethically, meaning that the user is kept in mind at all times.

The results of the study confirm that trust in Facebook among Estonians is very low and that users would like to have more control over their data. Design experts designed a solution that would provide Facebook users with tools to control and manage their personal information. Users would have the opportunity to obtain more information about which data is being collected about them, how it is used, and who has access to it. This gives users control over their information, which will ultimately increase the level of trust in the platform. Knowing how users perceive trust and its effect on their usage behavior, designers can design platforms that are trustworthy and, hence, user friendly.