Social networks hold so much potential information about their communicating users that these community portals can quickly be regarded as platforms of customer data. First and foremost among them is Facebook: since this network offers the greatest potential through its sheer size, it also promises the greatest added value [1].
Social media monitoring should exploit this potential and, through continuous analysis of the information, generate data that can be used for statistics, forecasts and/or recommendations for action.
Based on the generated customer statistics, strategic adjustments such as image campaigns or market analyses can be made.
A company can use social media monitoring to learn a lot about its customer base and potential customers and to react actively to events.
The problem with social media monitoring, however, lies in putting it into practice and in the resulting implementation. A social network has privacy policies that protect customers' private (but relevant) information. Further problems are the actual access to the information, the subsequent acquisition of the data and the preparation of the company-relevant information.
However, the added value of social media monitoring clearly shows that these issues are worth solving.
Every social network, and thus also Facebook, should offer a way to restrict which parts of a profile are visible to strangers. Although Facebook reserves the rights of use of its users for itself [2], the individual users can still restrict the visibility settings of their Facebook profile. Such a restriction prevents the content from being retrieved through social media monitoring.
If the information is restricted but still available to certain user groups (friends, family, public, ...), the data can nevertheless be obtained with the appropriate steps.
Several verification procedures are in place for data protection [3].
The Facebook Login plug-in is integrated into the developer toolkits, so that every application that interacts with Facebook can offer a log-in. The log-in is essential for using the Open Graph, because every action taken must always be assigned to the user who executes it.
The user must accept an authorization to allow the requesting third party to access the private data. This authorization is issued with a notification and a confirmation of it. The parties involved in the authorization flow are the client, the application's own servers and the Facebook servers.
Authorization is mandatory whenever Facebook Login is provided on third-party sites.
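To make this more concrete, here is a minimal sketch (in PHP) of how such an authorization request could be initiated; the app ID, redirect URI and requested permissions are placeholder examples, not values from a real application:

```php
<?php
// Sketch: build the URL of Facebook's OAuth login dialog.
// YOUR_APP_ID and the redirect URI are placeholders for values
// issued when registering the application with Facebook.
$appId       = 'YOUR_APP_ID';
$redirectUri = 'https://example.com/fb-callback.php';

$dialogUrl = 'https://www.facebook.com/dialog/oauth?' . http_build_query([
    'client_id'    => $appId,
    'redirect_uri' => $redirectUri,
    // Permissions (scope) the application asks the user to confirm.
    'scope'        => 'email,user_likes',
]);

// Redirect the visitor to the dialog; after confirmation Facebook
// calls the redirect URI back with a "code" parameter.
header('Location: ' . $dialogUrl);
```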
If a user logs in via Facebook Login and has thereby granted authorization for their data, an application (in most cases) additionally needs permission management. For example, the application may be allowed to post places, images, game progress or videos to the profile, or it may be limited to reading the profile's basic information.
Finally, the application needs an access token. This is a kind of key that is generated as a string after authorization. Whenever an application wants to access the Graph API, the access token must be presented with the request. The application's request is executed only if the access token carries the required access rights.
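A minimal sketch of how an access token might accompany a Graph API request, assuming the token has already been obtained through the log-in flow above (the token value here is a placeholder):

```php
<?php
// Sketch: present an already obtained access token with a Graph API call.
// $accessToken is assumed to come from the log-in / authorization flow above.
$accessToken = 'EXISTING_USER_ACCESS_TOKEN';

// Request the basic profile information of the logged-in user.
$url = 'https://graph.facebook.com/me?' . http_build_query([
    'access_token' => $accessToken,
]);

$response = file_get_contents($url);
if ($response !== false) {
    $profile = json_decode($response, true);   // JSON -> associative array
    // The call only succeeds if the token carries the required permissions.
    echo $profile['name'];
}
```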
For compliant use of the Open Graph, a series of steps is therefore necessary; these address, for example, the usage rights of the active users, but also ensure controlled application administration.
In practice, several access tokens can be combined in order to obtain the respective rights. Once generated, however, an access token is not valid forever, so new tokens have to be generated from time to time.
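One documented way of dealing with expiring tokens, not covered in the original text, is to exchange a short-lived token for a longer-lived one; the sketch below assumes placeholder app credentials and hedges on the response format, which differs between API versions:

```php
<?php
// Sketch: exchange a short-lived user token for a longer-lived one.
// App ID, app secret and the short-lived token are placeholders.
$url = 'https://graph.facebook.com/oauth/access_token?' . http_build_query([
    'grant_type'        => 'fb_exchange_token',
    'client_id'         => 'YOUR_APP_ID',
    'client_secret'     => 'YOUR_APP_SECRET',
    'fb_exchange_token' => 'SHORT_LIVED_TOKEN',
]);

$response = file_get_contents($url);

// Older API versions answer with a query string, newer ones with JSON.
if ($response !== false && $response[0] === '{') {
    $data = json_decode($response, true);
} else {
    parse_str((string) $response, $data);
}

$longLivedToken = $data['access_token'] ?? null;
```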
Facebook's Graph API contains implemented structures that are made accessible with an access token. This prevents unknown users who have no authorization, because they never accepted the authorization dialog, from gaining any insight into the content.
One addition to the topic of permissions is still needed: even if authorization has been granted and access rights to certain information or actions have been approved, it may still be the case that the profile itself has not released this information. An application cannot override the privacy settings of a profile.
On Facebook, links from third-party sites are described using an HTML extension in the form of meta tags.
Each post, for example on a wall, can be provided with a meta tag.
This tag states from which application, device or website the post was made, and also what kind of object it is.
A meta tag attaches specific attributes to a post so that any viewer can retrieve extended information about it. A practical analogy is the description printed on a product: the product itself exists, but without a description it cannot be classified.
A newly created post contains only the plain text written by the user. By adding meta tags, the post becomes a complete object in the Open Graph and carries extended information about its content.
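As an illustration, a small sketch of a page that describes itself to the Open Graph through such meta tags; the object values (title, URL, image) are made-up examples served from a PHP template:

```php
<?php
// Sketch: a page that describes itself to the Open Graph via meta tags.
// The object values (title, URL, image) are made-up examples.
?>
<!DOCTYPE html>
<html>
<head prefix="og: http://ogp.me/ns#">
    <title>Example product page</title>
    <!-- Open Graph meta tags turn the plain page into a typed object -->
    <meta property="og:type"      content="website" />
    <meta property="og:title"     content="Example product" />
    <meta property="og:url"       content="https://example.com/product/42" />
    <meta property="og:image"     content="https://example.com/product/42.jpg" />
    <meta property="og:site_name" content="Example Shop" />
</head>
<body>
    <!-- the visible post / page content -->
</body>
</html>
```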
Since the Open Graph is accessible via HTTP, objects, and thereby information, can be extracted from the graph.
There are several ways to access Facebook's database in order to retrieve this data. Facebook itself provides four interfaces to its own database [4]. Using the Facebook Query Language (FQL), an SQL-like language implemented specifically for Facebook, is the most advantageous method because it allows a targeted selection of data [5].
Another advantage of FQL is that calls can be nested, so that results can be searched again within previous results. To access the Facebook database, FQL uses Facebook's Graph API. Individual attributes can be selected from various tables; these are then filtered out of the database and returned via the Graph API. Each database table contains attributes specific to its category. Facebook continuously refines the level of detail and the structure of the FQL tables so that search queries remain optimized. A big advantage of FQL is the precision of the query: several requests can be combined and the required data requested at the same time. Currently, 77 database tables are available for access on Facebook [7]. After the search query has been sent, FQL returns the results as a JavaScript Object Notation (JSON) array, which can then be read out.
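A minimal sketch of such an FQL query issued against the Graph API's /fql endpoint from PHP; the access token and page ID are placeholders, and the stream table is used as an example:

```php
<?php
// Sketch: run an FQL query against the Graph API's /fql endpoint.
// The access token and the page ID are placeholders.
$accessToken = 'VALID_ACCESS_TOKEN';
$pageId      = '1234567890';

// Select a few attributes from the "stream" table (the wall posts of a page).
$fql = "SELECT post_id, message, created_time FROM stream WHERE source_id = $pageId";

$url = 'https://graph.facebook.com/fql?' . http_build_query([
    'q'            => $fql,
    'access_token' => $accessToken,
]);

$response = file_get_contents($url);
$result   = json_decode((string) $response, true);   // JSON answer -> PHP array

// The rows are returned in the "data" array of the JSON response.
foreach ($result['data'] ?? [] as $row) {
    echo $row['created_time'] . ': ' . $row['message'] . PHP_EOL;
}
```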
If data queries are used professionally, it is imperative to optimize performance. In a product comparison involving several hundred or even a thousand requests to the Facebook servers, the system quickly becomes overloaded or takes too long to refresh the data. Facebook Inc. has recognized this problem itself and offers an alternative solution.
The FQL approach is increasingly considered the most efficient, since it bundles multiple requests into a single statement and thereby saves performance [8]. In addition, Facebook Inc. has introduced the batch request.
A batch request is a logical sequence of HTTP requests represented as a JSON array. A batch query contains up to 50 requests to the Facebook server and can perform several actions at once using the HTTP methods GET, PUT, POST and DELETE [10].
Large queries can thus be simplified with batch requests: instead of many individual HTTP queries, a single batch request is executed that retrieves the information in parallel and thereby saves time. This time saving is of great importance for company-relevant statistics and should therefore not be ignored.
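The following PHP sketch shows what such a batch request could look like; the access token and the bundled example requests are placeholders:

```php
<?php
// Sketch: bundle several Graph API calls into one batch request (max. 50).
// The access token and the example relative URLs are placeholders.
$accessToken = 'VALID_ACCESS_TOKEN';

$batch = [
    ['method' => 'GET', 'relative_url' => 'me'],
    ['method' => 'GET', 'relative_url' => 'me/friends?limit=10'],
    ['method' => 'GET', 'relative_url' => 'cocacola'],   // a public page
];

$ch = curl_init('https://graph.facebook.com/');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => http_build_query([
        'access_token' => $accessToken,
        'batch'        => json_encode($batch),
    ]),
]);
$response = curl_exec($ch);
curl_close($ch);

// The answer is a JSON array; each element carries the HTTP status code
// and the body of the corresponding single request (itself a JSON string).
foreach (json_decode((string) $response, true) ?: [] as $single) {
    $body = json_decode($single['body'], true);
    // ... evaluate $body ...
}
```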
With regard to the problems of social media monitoring described above, the implementation of the data collection can now be outlined.
A combination of FQL and batch requests ensures optimized performance: batch requests reduce the runtime, while FQL allows more efficient filtering.
If PHP is used for this data acquisition, the JSON object obtained can be parsed after retrieval and its array used to filter the data [11].
This data could then be exported to an Excel spreadsheet and compared.
To gather statistics, however, it is important to store the object ID (object_id) or the unique URL of the objects to be monitored in a corporate database, so that the objects do not have to be searched for manually.
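Putting the pieces together, here is a minimal end-to-end sketch under the assumptions above: the access token is a placeholder, the object IDs are kept in a simple array instead of a corporate database, and the result is written to a CSV file that Excel can open (the batch bundling shown earlier is omitted for brevity):

```php
<?php
// Sketch: one collection step of the monitoring pipeline.
// Assumptions: valid access token (placeholder), object IDs already stored
// (here: an array instead of a corporate database), CSV output for Excel.
$accessToken = 'VALID_ACCESS_TOKEN';
$objectIds   = ['1234567890', '9876543210'];   // e.g. monitored pages

$csv = fopen('monitoring_export.csv', 'w');
fputcsv($csv, ['object_id', 'post_id', 'created_time', 'message']);

foreach ($objectIds as $objectId) {
    // FQL query for the wall posts of the monitored object.
    $fql = "SELECT post_id, created_time, message FROM stream WHERE source_id = $objectId";
    $url = 'https://graph.facebook.com/fql?' . http_build_query([
        'q'            => $fql,
        'access_token' => $accessToken,
    ]);

    $result = json_decode((string) file_get_contents($url), true);

    // Parse the JSON answer and append one CSV row per post.
    foreach ($result['data'] ?? [] as $row) {
        fputcsv($csv, [$objectId, $row['post_id'], $row['created_time'], $row['message']]);
    }
}

fclose($csv);
```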
By applying social media monitoring correctly, companies can tap into and exploit much of the important potential of communication portals.
The great thing about social media monitoring is that it never stops: data can be pulled continuously. Once the structure has been implemented, little further effort is needed before added value can be derived for the company, for example in the form of image campaigns.
The data thus arrives practically by itself; it only has to be accessed.
The continuity of the data analysis suggests that good social media monitoring can deliver lasting improvements in terms of customers and thus purchasing power. This is also the reason why social media monitoring is currently so heavily used and developed.
With that, happy monitoring!
All references were retrieved in the period from 24.01.2013 to 28.01.2013.
Referenced sources:
[1] Facebook Newsroom; Facebook Newsroom
[2] Voting of user rights; Time Online
[3] Authorization steps; Facebook Developer
[4] Facebook API; Facebook Developer
[5], [7], [8] Facebook Query Language; abouttheweb blog
[6] Facebook API Graph Explorer; Facebook Developer
[9] Batch request; Allfacebook Blog
[10] Facebook Batch Request; Facebook Developer
[11] Reading out a JSON object; Dr. Web Blog