
The web as a platform: web services, the cloud and the app

In ‘What is Web 2.0’, the first principle of Web 2.0 identified is the use of the web as a platform, emphasizing the value of providing services automatically through the web rather than focusing on traditional desktop software or services that necessitate lengthy human negotiation (O’Reilly, 2005). There is now a host of software available through the web: from software so integrated into our experience of the web that we do not give it a second thought, to software so embedded in our desktop experience (and grown to such complexity) that web versions continue to feel like poor imitations.

Probably the most important piece of software we use on the web is one we do not even think of as software: the search engine. While it is so integrated with our online experience as to be, for some people, synonymous with browsing, considering the alternative of a search engine application running on a person’s personal computer (PC) quickly demonstrates some of the advantages of using the web as a platform. In simple terms, a search engine can be thought of in three parts: the robot, the indexer and the ranking algorithm. The robot (also known as a web crawler or spider) is a program that downloads pages from the web in an iterative fashion. Starting with a list of seed URLs, a robot downloads the web pages at those URLs and extracts any URLs it finds in them, which are then downloaded in turn; the process is repeated until all the required web pages have been downloaded. To create a search engine comparable to one of the major search engines would require the robot to download billions of pages, and with new pages being created all the time and many pages updated on a regular basis, it would be an endless task. The index enables the search engine to match documents to a query without having to search through every document each time a query is entered.
At the very minimum, an index is likely to include all the words in a document, although it may also include a host of other features, such as anchor text on those pages linking to a site, the type of page (e.g. .pdf, .rtf or .doc), or the creation date. While a more extensive index may help with the retrieval of more pertinent results, it will nonetheless take up more computer processing power. Finally, with the simplest of queries returning millions of hits, the results need to be ranked. While this may once have been based on the frequency or position of search terms on a page and how other websites link to a site, a search engine such as Google now uses over two hundred signals when ranking web pages (Google, 2011). The downloading and ranking of billions of web pages would take huge bandwidth and processing power, far beyond that which is available to the average user. Even if it were possible, it would be a huge waste of resources, as few of the pages would ever need to be discovered by the person who had downloaded and indexed them. Although there will be popular searches that many people use, these will be dwarfed by the long tail of niche searches that will only be used occasionally by a handful of people (Anderson, 2006). It is only efficient to index the web when there is a sufficiently large audience to make use of the index. The large audience can also provide useful feedback for a search engine, and help to improve the overall system. If, for example, on one particular search, people are regularly clicking on one particular link rather than another, the search engine can start to rank the selected item more highly; this is an example of harnessing collective intelligence, which will be discussed in the next section. 
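The three parts described above can be made concrete in a toy sketch. This is a minimal illustration rather than how any production search engine works: the `fetch` callable is an assumption of the sketch (injected so the example needs no network access), real robots handle politeness and failure, and real rankers combine hundreds of signals rather than the single term-frequency signal used here.

```python
import re
from collections import defaultdict, deque

def crawl(seed_urls, fetch, limit=100):
    """The robot: starting from a list of seed URLs, download each page
    and queue every link found on it that has not yet been visited."""
    link_pattern = re.compile(r'href="([^"]+)"')
    queue, pages = deque(seed_urls), {}
    while queue and len(pages) < limit:
        url = queue.popleft()
        if url in pages:
            continue
        html = fetch(url)                       # an HTTP GET in practice
        pages[url] = html
        for link in link_pattern.findall(html):
            if link not in pages:
                queue.append(link)
    return pages

def build_index(pages):
    """The indexer: map every word to the pages containing it (with a
    per-page count), so a query never has to scan the whole collection."""
    index = defaultdict(dict)
    for url, html in pages.items():
        text = re.sub(r"<[^>]+>", " ", html)    # strip markup, keep text
        for word in re.findall(r"\w+", text.lower()):
            index[word][url] = index[word].get(url, 0) + 1
    return index

def search(index, query):
    """The ranking algorithm: pages containing every query term, ordered
    here by raw term frequency alone."""
    postings = [index.get(term, {}) for term in query.lower().split()]
    if not postings:
        return []
    hits = set.intersection(*(set(p) for p in postings))
    return sorted(hits, key=lambda url: -sum(p[url] for p in postings))
```

Even this sketch makes the chapter’s point about scale visible: the `limit` parameter exists because, without it, the crawl loop never terminates on a web where pages are created and updated continuously.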
A wide variety of traditional desktop software is now available online, although with varying degrees of success; whereas email clients seem to have made a smooth transition from desktop application to web service, web-based versions of office software have gained less penetration. Many email users now seem happy to switch between desktop and webmail clients as necessary to suit their needs, with the same account often allowing access either via a web interface or via a desktop client. While a desktop client may have additional functionality and allow the reading and drafting of responses without an Internet connection, web-based versions offer a convenience of accessibility that is not possible with desktop clients. A Pew survey in 2008 found that 56 per cent of those asked said they used webmail services such as Hotmail, Gmail or Yahoo, in comparison to 29 per cent who said they used online application programs such as Google Documents (a browser-based alternative to Microsoft Office) or Adobe Photoshop Express (an image-editing web application) (Pew, 2008). The gap between the use of the two types of service is in fact likely to be much larger once the time spent using the applications is taken into account, in addition to the emphasis on webmail access over traditional email accounts: whereas many users are likely to use an online mail service exclusively, Google Docs or Adobe Photoshop Express is more likely to supplement rather than replace existing desktop software. One of the reasons for the high penetration of web-based email services is likely to be the relative importance of accessibility over functionality, as most email messages are simple text. In contrast, office documents will often make use of a far wider range of functionality, much of which is not yet available in the online equivalents.
While online versions are likely to see increased functionality in the future, it is not enough that they are merely online equivalents of traditional office applications. Revisiting the web as a platform in Web Squared: Web 2.0 Five Years On, Tim O’Reilly and John Battelle (2009) emphasize that it is the network as platform that is important. This means it is not just about offering existing desktop software through the browser, but about building applications that get better the more people use them, such as enabling documents to be accessed by multiple users at the same time and analysing the way a word processor is used so that it can be improved. Although desktop services are being made available online, Web 2.0 and the web as a platform are more often thought of in association with the many social media services that have emerged. Social media builds upon the ideas of Web 2.0 for the creation and exchange of user-generated content, and over the first decade of the twenty-first century, technologies such as blogs, wikis and social network sites have become increasingly well established in a wide range of fields (Kaplan and Haenlein, 2010). Blogs (frequently updated websites displaying posts in reverse-chronological order) are one of the longest-established social media technologies. The term was first applied to this genre of website in 1997, and the launch of free web-based software in 1999 saw an explosion in the number of blogs online (Blood, 2000). Web-based blogging software makes the process of updating blogs very simple, and the popularity of blogs has also seen the creation of many sub-genres, such as the microblog, with sites like Twitter being used primarily for the sharing of text messages of 140 characters or fewer, and Tumblr for the quick sharing of any type of content from a browser or desktop.
Importantly, a blog platform such as WordPress provides not only a simple piece of software for individuals to share their opinions with the world, but also a plug-in architecture that allows users to extend the functionality of a blog. Where many blogs or microblogs are being hosted by the same site, such as LiveJournal or Twitter, communities emerge that have more in common with social network sites.
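The plug-in architecture mentioned above can be illustrated with a hook mechanism of the kind WordPress popularized. The class and method names here are hypothetical, a minimal sketch of the idea rather than the WordPress API: plug-ins register filter functions that the blog engine runs over a post’s content at publishing time, extending behaviour without changing the core software.

```python
class Blog:
    """A toy blog engine whose behaviour can be extended by plug-ins:
    each plug-in registers a filter that is run over a post's content
    before it is published."""
    def __init__(self):
        self.filters = []

    def add_filter(self, func):
        """A plug-in calls this to hook into the publishing pipeline."""
        self.filters.append(func)

    def publish(self, content):
        """Run the content through every registered filter, in order."""
        for f in self.filters:
            content = f(content)
        return content
```

A hypothetical sharing plug-in, for instance, would simply call `blog.add_filter(lambda post: post + " [Share]")`; the core engine needs to know nothing about it.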

Social network sites have been defined as sites that enable users to create public profiles in a bounded system, to create connections with other users, and to navigate and view both their own and other users’ connections (Boyd and Ellison, 2007). While a site such as Twitter may be thought of primarily as a microblogging service, it can also be seen to meet each of these criteria and so may be thought of as a social network site. Since the launch in 1997 of SixDegrees.com, the first recognizable social network site, many different social network sites have come and gone (ibid.). It has been suggested that social network sites can be categorized according to three main types: those for networking, those for socializing and those for navigation (Thelwall and Stuart, 2009). While a site such as LinkedIn may be seen as primarily for networking in a professional capacity with people a user may or may not already know, a site such as Facebook is primarily used for socializing with people a user already knows. Navigation refers to those sites that, while having social network site characteristics, are primarily focused on providing access to content. For example, the photo-hosting site Flickr and the video-hosting site YouTube both have social networking capabilities, although they are primarily for accessing content, with the social network functionality providing a filtering facility. Whereas blogging can be seen primarily as an individual practice, social network sites enable users to engage more easily with one another and one another’s content, although both may encourage a fantasy of participation (Dean, 2008). Although users may believe their opinions are being listened to and are having an impact, in most cases they are either not being read or are being ignored. Whereas it was once publishing that was considered a claim to authority, now it is attention that is increasingly important.
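The three defining criteria above (public profiles, connections between users, and the ability to navigate and view connection lists) can be made concrete as a toy data structure. This is a sketch against Boyd and Ellison’s definition, not a model of any real site:

```python
class SocialNetwork:
    """A minimal bounded system meeting the three criteria: profiles,
    user-to-user connections, and traversable connection lists."""
    def __init__(self):
        self.profiles = {}        # username -> public profile fields
        self.connections = {}     # username -> set of connected usernames

    def create_profile(self, user, **fields):
        self.profiles[user] = fields
        self.connections[user] = set()

    def connect(self, a, b):
        """An undirected 'friend' link; a directed 'follow' link, as on
        Twitter, would simply omit the reverse entry."""
        self.connections[a].add(b)
        self.connections[b].add(a)

    def view_connections(self, user):
        """Any user's connection list can be navigated and viewed."""
        return sorted(self.connections[user])
```

The `connect` method hints at one real design split between the site types discussed below: socializing sites tend toward mutual links, whereas navigation-oriented sites often use one-way follows as a content filter.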
The increased use of certain social network sites in favour of the traditional blog has led to claims that blogging is dead or waning; rather than reflecting some inherent fault with blogging, however, this is a reflection of the emergence of more specialized tools, which may be more appropriate for particular situations.1 In September 2011 Nielsen reported that social network sites and blogs accounted for 23 per cent of the time Americans spent online (Nielsen, 2011). Since 2007, social network sites have attempted to offer increased functionality by providing application platforms on which external developers can build applications; these both provide a marketplace for application designers and give users access to functionality that social network sites would not have the time or money to develop themselves. The most popular of these applications have been downloaded tens of millions of times, and although the most downloaded applications are often games (e.g. CityVille, a city-building simulation game), there are also more obviously useful applications, such as those providing additional communication functionality (e.g. Windows Live Messenger) or enabling the editing of office documents (e.g. Microsoft’s Docs). The increasing power that a small number of social network sites have over the way people interact online should, however, be cause for concern. Facebook, the largest social network site, currently reporting over 800 million active users, now has a significant amount of power over an increasingly important communication platform, with the potential to dictate the sort of content that people share (Facebook, 2011). Rather than the limits of freedom of speech being established by the courts and the rule of law, they are increasingly set at the whim of a social network site or the tyranny of the masses.
While the size of Facebook and the advantage resulting from the network effect make it seem unlikely that it will lose its market dominance in the near future, the rapid fall of MySpace is a reminder that no website is invulnerable. Once the most popular social network site, MySpace was bought by News Corporation in 2005 for $580 million, but was sold in 2011 for a mere $35 million. It may be that Facebook is overtaken not by a single competitor but by a set of open standards, as there is increasing interest in distributed approaches to social network sites that would prevent any one site achieving such market dominance in the future, and allow individuals and organizations to take control of their own data (Stuart, 2011a). The other type of social media site that has grown in popularity over the first decade of the twenty-first century, and that in one specific case has become infamous, is the wiki. Wikis enable the collaborative creation and editing of web pages through a web browser, using either a simple markup language or a text editor. The most famous of these, Wikipedia, provides an example of both what is possible through a wiki and its limitations. Many times the size of its rival encyclopaedias, and far more popular, Wikipedia has nonetheless been criticized for allowing anyone to contribute and for relying on contributors writing about whatever areas happen to interest them. It relies on a variation of Linus’s Law: ‘Given enough eyeballs, all bugs are shallow’, which was first applied to the development of open source software (and named after Linus Torvalds, who started the development of the open source operating system Linux) (Raymond, 1999). The expectation is that, given enough users, factual errors will be quickly spotted; this is not always the case, however, as a large number of users on a site does not mean a large number of users visiting every page equally.
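The collaborative editing model just described, in which anyone may edit a page but every version is retained and any earlier state can be restored, can be sketched as follows. This is a minimal illustration of the idea, not MediaWiki’s actual storage model:

```python
class WikiPage:
    """A toy wiki page: anyone may edit, every edit appends a new
    version, and any earlier version can be restored (rolled back)."""
    def __init__(self, text=""):
        self.versions = [text]

    @property
    def text(self):
        """The current content is simply the newest version."""
        return self.versions[-1]

    def edit(self, new_text):
        self.versions.append(new_text)

    def rollback(self, version_number):
        """Restoring an old version appends it as the newest one, so
        the page history itself is never rewritten."""
        self.versions.append(self.versions[version_number])
```

The design choice in `rollback`, appending rather than deleting, is what makes vandalism cheap to undo while leaving an audit trail; the weakness Linus’s Law glosses over is that someone still has to look at the page and invoke it.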
Most famously, the journalist John Seigenthaler’s Wikipedia biography was changed to falsely suggest that he had been linked with the assassinations of John and Robert Kennedy, and the information remained unchallenged for a number of months, with the resulting controversy leading to new guidelines for the biographies of living persons. Although a study by the journal Nature shortly after the controversy erupted showed not dissimilar levels of accuracy in a comparison between Wikipedia and the Encyclopaedia Britannica, the science pages that were analysed are not necessarily the most contentious or the most likely to be vandalized (Giles, 2005). While Wikipedia is the best-known wiki, used by 53 per cent of American adult Internet users (Zickhur and Rainie, 2011), there are important differences in people’s understanding of how the site works, the credibility they assign to the information they read and their willingness to follow up sources (Flanagin and Metzger, 2011; Menchen-Trevino and Hargittai, 2011). Used properly, however, wikis can provide an ideal platform for group collaboration, allowing the quick and simple creation and editing of web pages, with versioning that allows edits to be rolled back.

All of these Web 2.0 applications may be thought of as taking place in ‘the cloud’, a metaphor for the Internet, with ‘cloud computing’ generally used to refer to ‘storing, accessing, and sharing data, applications, and computer power in cyberspace’ (Anderson and Rainie, 2010). The cloud may be seen as the natural conclusion of the provision of services through the web, moving from the provision of software services to the provision of computing power itself. While cloud computing incorporates social software, it also takes the current generation of Web 2.0 services to the next level.
Web 2.0 services as we generally think of them typically provide one particular service through a web browser, with users needing to go to different sites for different services; although Facebook now provides a platform in addition to its core social network service, it is far more limited than the services that could be available through the network as a platform. While cloud computing includes software as a service and data as a service, it can also include hardware as a service, and so could enable the virtualization of hardware (Wang et al., 2010). The era of the PC is synonymous with running applications on the desktop, as opposed to multiple users time-sharing first on mainframes and then on smaller mini-computers. While PCs originally offered convenience in comparison to the time-sharing of limited computing resources, such an approach may be seen as extremely wasteful as IT infrastructure becomes increasingly complex. Organizations are spending an increasing amount of time and money on their IT infrastructure: between the end of the 1960s and the year 2000, information technology went from less than 10 per cent of an American company’s capital equipment budget to 45 per cent (Carr, 2009). With the regular installing, configuring and updating of software, and with computer resources quickly becoming outdated, the outsourcing of computing platforms can be seen as the smart solution, especially as much of the current computing infrastructure sits idle most of the time (Wang et al., 2010). Using the network rather than the desktop as a platform means that organizations and individuals with the necessary skills can tap into the computing power they need as and when they need it, rather than constantly having to update systems and software to deal with peak demand; this would potentially allow new innovative Internet services without large capital outlays (Armbrust et al., 2009).

Widespread usage of the cloud for both storage and processing power can be seen as a natural destination for the web of today, and in a ‘Future of the Internet’ survey carried out as part of the Pew Internet & American Life Project, 71 per cent of the technology experts participating agreed with the statement that: ‘By 2020, most people won’t do their work with software running on a general-purpose PC. Instead, they will work in Internet-based applications such as Google Docs, and in applications run from smartphones’ (Anderson and Rainie, 2010). There are still a number of challenges ahead before cloud computing becomes more widely adopted, especially in the area of privacy and confidentiality, where there have been several high-profile failures (Ryan, 2011); in September 2011, for example, Dropbox, a file-hosting service, briefly allowed unauthorized access to accounts (Dropbox, 2011). Nonetheless, such concerns are likely to be quickly overcome if users find that there are significant advantages to the online services on offer. Not everything is moving to the cloud, however, and it is noticeable that Pew’s survey grouped Internet-based applications together with applications run from smartphones. Although there has been a change in the way increasing numbers of people access software through their desktops, the rise of the app store has created a renaissance of the downloaded application through the widespread use of mobile phones and tablets. No prediction of the number of future downloads currently seems too high, with one report claiming mobile app downloads will reach 98 billion by 2015 (Perez, 2011). Like computers, smartphones (mobile phones that run a high-level operating system, such as the Apple iPhone or Android handsets) are capable of running multiple programs at the same time, and they are an increasingly significant part of the mobile phone market (Nielsen, 2009).
Smartphones are merging people’s home and work lives; they are always turned on, and users reportedly spend less time on other activities after getting a smartphone (Ofcom, 2011). Between O’Reilly’s original paper in 2005 and his revisiting of the subject in 2009, the focus moved from the web as a platform to the network as a platform, and the always-turned-on smartphone, with its increasing number of sensors, is an ideal way to connect to the network. Although mobile applications show that it is possible to connect to the network without putting products on the web, in the same way that there is a risk in the dominance of a single social network site such as Facebook, there is the inevitable risk of vendors taking too much control of the phone as a platform; Apple, for example, has a censorship policy for apps and does not allow certain types of content in the app store (Dredge, 2011). Time will tell whether downloaded applications will be a short-term option until improvements in mobile telecommunications and the greater functionality enabled by HTML5 push many of these applications to the cloud, or whether the simplicity the app store provides as a way of charging for applications will see developers continue to focus on the downloaded application.
