How to Make Social Media work for your business

Posted by Olivia Naire in Natural Search SEO

 

Quality social media content can set you apart from competitors, keep your brand top of mind for customers and strengthen your positioning as a leader, provided that content sits at the centre of your online marketing strategy. Setting clear goals across the different aspects of social media content management is a prerequisite for a successful online strategy.

 

Blog posts, status updates, videos, contests, tweets, poll questions, infographics and photos… Choose your social media channels responsibly and always consider your business objectives. Monitor your audience's presence and activity so that you do not end up creating an account on every available social channel without a clear focus.

 

Post at the optimal time, using monitoring tools to define the best time to post, and convey your main message within the first few characters. Once you have determined the frequency and timing of your posts, you can create a calendar with tools like Hootsuite or SproutSocial to manage your social media channels and all engagement activities: comments, new fans and followers, interaction and responses. Decide the frequency of your posts: weekday posts, weekend posts or every day of the week?

 

Define your audience by asking: why, when, who? Once you know the answers you can create your brand's online personality and communicate your point of view consistently as you interact with your fans and followers.

 

Stimulate interest through a varied yet focused topic selection: education, entertainment, inspiration, promotion or a combination of them all. Mix how-to suggestions, questions, links to valuable sources (your own and other people's) and promotional content in a valuable blend.

 

Establish yourself as a leader through selective content and earn your followers' trust. Stay consistent with your brand's vision, image and character, and always be original.

 

Social platforms and networks are channels where promotional and advertising content should be expressed only in a subtle and intelligent manner. Finally, focus on constant user engagement by stimulating reciprocal activity and encouraging users to share life experiences connected to your brand.

 

Important note: Always be true and original when it comes to your social media content!

How to Analyse Social Media Traffic with Google Analytics

Posted in Natural Search SEO

 

Most likely you have already set up a Google Analytics account to analyse your website's traffic.

 

Today we will walk through a step-by-step guide on how to identify, segment and analyse your social media traffic using the Google Analytics platform.

 

The first step is to discover the social media sites that have sent the most traffic to your site during the last 30 days.

 

To perform this action, go to Traffic Sources – Sources – Referrals.

 

[Screenshot: the Referrals report]

 

On the right side of your screen you will find the full list of your referral sources.

 

For our example site the top 3 referral sources are:

 

[Screenshot: the top referral sources for our example site]

 

In order to create your advanced reports you have two options.

 

Create one advanced report for each social media source or create an aggregated advanced report for all your social media sources.

 

Let’s see first how to create an advanced report for each of your social media sources.

 

To perform this action, click the “Advanced Segments” button and then click + New Custom Segment.

 

[Screenshot: creating an advanced segment for referral traffic]

 

Let’s create an advanced segment for Facebook social media traffic first.

 

Add a name for your segment (we chose "Facebook") and include all "facebook" sources that sent traffic to your site. Specifically:

[Screenshot: the Facebook segment conditions]

Then click "Test Segment" to verify that the advanced report works properly and click "Save Segment".

 

In the same way, you can create an advanced report for each of your social media traffic sources (e.g. Twitter, Pinterest, Google+, LinkedIn, etc.).

 

Now, to create an aggregated advanced report for all your social media sources, click the "Advanced Segments" button again and then click + New Custom Segment.

 

The difference now is that we will create this segment by selecting "Matching RegExp" as the option from the dropdown menu. Then we will add all social media traffic sources separated by a pipe character (|), just like the line below:

facebook|t.co|plus.url.google|hootsuite|bit.ly|linkedin|youtube|delicious|stumbleupon

 

Note: Make sure that there are no spaces between the words or the pipes.
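If you want to sanity-check the pattern before pasting it into Google Analytics, here is a minimal Python sketch that classifies referrer hostnames with the same expression (the hostnames are hypothetical). One caveat: an unescaped dot in a regex matches any character, so the GA pattern above is slightly looser than a literal match; escaping the dots, as below, is stricter:

import re

# The same source list used in the segment; the pipe (|) means OR in a regex.
SOCIAL_PATTERN = re.compile(
    r"facebook|t\.co|plus\.url\.google|hootsuite|bit\.ly|"
    r"linkedin|youtube|delicious|stumbleupon"
)

# Hypothetical referrer hostnames, e.g. from a server log export.
referrers = ["m.facebook.com", "t.co", "news.example.com", "www.linkedin.com"]

for host in referrers:
    print(host, "->", "social" if SOCIAL_PATTERN.search(host) else "other")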

 

 

You can also include other social media sites that send traffic to your website.

 

Then click “Test Segment” to verify that the advanced report works properly and click “Save Segment”.

 

[Screenshot: the aggregated social media segment]

 

Now that you have set up your advanced reports, you are ready to analyse your social media traffic by whichever metric you want.

 

For example, you can select one or more advanced reports from your list and compare them on specific metrics such as engagement, user location or the devices used to access your website.

 

To do that, click the “Advanced Segments” button, then choose one or more of your “Custom Segments” and click Apply.

 

Now you are set to choose whichever metric you want (e.g. Engagement) in order to analyse and compare your social media sources.

 

[Screenshot: comparing segments in the reports]
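If you prefer to run the same comparison offline, here is a small sketch that aggregates an exported referral report by the same social/other split; the CSV file name and its source/sessions columns are assumptions about your export:

import csv
import re
from collections import defaultdict

# Same social-source pattern as in the segment above.
social = re.compile(
    r"facebook|t\.co|plus\.url\.google|hootsuite|bit\.ly|"
    r"linkedin|youtube|delicious|stumbleupon"
)

sessions = defaultdict(int)
with open("referral_traffic.csv", newline="") as f:  # assumed export file
    for row in csv.DictReader(f):  # assumed columns: source, sessions
        segment = "social" if social.search(row["source"]) else "other"
        sessions[segment] += int(row["sessions"])

for segment, total in sessions.items():
    print(segment, total)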

 

So, log into your Google Analytics account:

• Identify your social media traffic sources
• Create your advanced reports

…and start analyzing your social media traffic today!

 

The more you get to know your visitors and the more you serve them, the more loyal they will become towards your website and brand.

 

Apart from keeping track of your social media performance via advanced segmentation, you can use the Reputation Tracker of Web SEO Analytics to gain more valuable insight into your website's social media presence. It lets you see the total number of shares, comments and mentions of your page on Facebook, on Twitter and on social bookmarking services (like StumbleUpon).

The Importance of Backlinks in 2013: Dixon Jones Interview

Posted in Natural Search SEO

This is an interview with Dixon Jones. Dixon is the Marketing Director of MajesticSEO and a founding Director of Receptional, a leading UK Internet Marketing consultancy. He has run businesses since 1988 and has run and marketed online businesses since 1997. Follow Dixon on Twitter.

 

Today we have with us Dixon Jones. Dixon is the Marketing Director at MajesticSEO and the Founding Director of Receptional LTD. This interview revolves around the future of backlinks, effective link building methods, high quality backlinks and interesting new features coming up from MajesticSEO.

 

For me, a mention from a person on Twitter carries EXACTLY the same meaning as a link. If the Twitter user is a robot, or spam, then nobody will listen or care, just like a link on a computer-generated low quality web page. So yes – social signals count… but they are still links. I have always said that a “link” is a web-based word for “relationship”.

 

I think that we (as SEOs) got a little too fixated on anchor text back in the days when we could pull this data from search engines and it has taken until Penguin for us to see that we were concentrating too much on this signal. I think a little bit of anchor text is still a good thing… just enough to give context. After that, you need several (but not hundreds) of respectable web pages citing your web page. The rest is crap.

 

I disagree that attracting high quality links is one of the hardest tasks. I believe the HARDEST task is building up a good reputation for innovation or for thought leadership or for downright integrity. If you cannot get people saying good things about you in the bars and offices then you will also not attract good links. The problem for many SEOs is that they do not have the social skills to have a conversation in a bar or an office! So they need to pair up with people with stronger social skills and realize that a good link starts with mutual respect.

 

Here’s what I can confirm. When Majestic has a big announcement, we tell 10-20 influential bloggers – people like State of Search and Search Engine Journal. They can either publish the news or not; that’s their choice, as there is no financial inducement to do so. However, when they DO publish, we tend to get more traffic (from them and from search engines), and when they don’t, it is USUALLY because we gave them something that they thought was not interesting. What we do try to do is avoid giving the information to the influential people at the same time as spamming it across blogs and forums. If we were to do press releases to large networks instead of named individuals, then most likely Google would not see the message as coming from authority sites. Even worse (MUCH WORSE), the influential bloggers would see that they had just reported a story which was already live on low quality sites. They will only do that once… then you are on borrowed time. So Link Bait and Guest Blogging indeed still work – but the quality of the blog and its content FOCUS is very important. I see many low quality blogs which seem to talk about clothing in one post and virtual hosting in the next. That is unlikely to be a focused blog, and at this point in time none of the link providers give CONTEXT to the sites – although there is increasing context surrounding the link itself. Consider this… what if a search engine categorized every site based on its most prominent keywords, and each site had a fixed amount of juice (say from 0-100)? Now if a site worth 10 points talks about 2 things religiously, then each page in context will be worth more than on a site worth 100 that talks about 100 things (10/2 = 5 juice per topic against 100/100 = 1). Because data is not yet visualized in context especially well (although anchor text spread helps), we miss the obvious.


One great way is to use this to compare whether a keyword is better pursued through organic means or through PPC. The Google Keyword Tool gives (poor) information about how competitive a phrase is on PAID channels – and SEOs have (to date) been using this data as a proxy for organic keyword analysis. But in practice, there are differences between what people write about on the web and what people type into search engines.

 

Wow – now that I look back at it, we were pretty busy last year, weren’t we? I am amazed that Flow Metrics were only launched in May or so:

 

So after launching Flow Metrics in May, which gave us by far the most solid metrics for evaluating a URL’s importance on the fly, we were able to dramatically increase the size of our Fresh index in June. This is a process that we may get back to in 2013 if needed. We were also able to start getting our data into smaller development tools through our OpenApps technology, which meant that developers did not have to spend large amounts on our API if they were prepared to let their users have direct accounts with Majestic. This culminated in the Chrome Backlink Analyzer extension, a Google Docs spreadsheet and integration into SEOGadget’s spreadsheets, and we hope other developers will create new apps using this technology in 2013.

 

As you point out, we launched a global anchor text checker in July, and later in the year we were able to show anchor text by site in the Site Explorer. August saw the launch of tracking reports. These are still underused – which is a shame because they are free for subscribers. They track the Flow Metrics of any URL over time, but you need to set the report running to keep track, as we do not want to record historical Flow Metrics data for 4 trillion URLs when our customers only care about a tiny fraction of these. The biggie, though, was the ability to show new and lost links day by day, straight in the Site Explorer. We wanted our users to come to our site regularly – not just once a month. Looking at the new/lost links chart gives people a reason to do that. Then – in December – we also launched “Bucketlists”. These are great for link research… whenever you are looking for good (or bad) links, you can add the ones that interest you to a Bucketlist, for later analysis or to pass to another member of the team to contact. So plenty of development… I am sure I missed lots… and I think we will try to build on most of those innovations in 2013 if we can keep up the momentum.

 

I would encourage users to use the feedback button now on every page on the site (except the homepage) if they see anything that we should be doing better. We are compiling this feedback and using it to help direct our efforts.

 

Dixon.

How to Create an effective SEO Competitor Analysis – Part 2

Posted in Natural Search SEO

 

In the first blog post we covered the 1st part of the SEO competitor analysis, including a range of topics (meta-tag reviews, keyword health-checks, internal linking checks, URL reviews) that you should pay attention to when you conduct an onsite SEO competitor analysis.

 

The second part covers equally important segments that should be taken into consideration, as seen below.

 

Poor website development or a poorly designed website can increase page load time and by extension seriously hurt the user experience on the website and, in the end, the onsite SEO results. If you have discerned this type of problem on your competitors’ websites, make sure to avoid such issues on your own.

 

Images should have descriptive file names, as this has a significant impact on the SEO process. If a competitor stores an image under a name like img881, their onsite SEO is not successful. If the image is named Athens-Acropolis (in other words, descriptive and relevant), their onsite SEO can be considered successful, and this is the approach to follow.
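A quick way to spot img881-style names on a page you are auditing; the file names and the “non-descriptive” pattern below are illustrative assumptions, not a standard rule:

import re

# Hypothetical image file names pulled from a competitor's page.
images = ["img881.jpg", "DSC_0042.jpg", "athens-acropolis.jpg"]

# Treat names that are just "img"/"dsc" plus digits as non-descriptive.
non_descriptive = re.compile(r"^(img|dsc)[_-]?\d+\.", re.IGNORECASE)

for name in images:
    verdict = "rename me" if non_descriptive.match(name) else "looks descriptive"
    print(name, "->", verdict)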

 

You should focus your attention on the headers (H1, H2, H3 etc.) on the site, as they play a significant role in onsite SEO. For example, if you see headers like “Luxury Rooms in Crete” on your competitors’ pages, it’s a strategy you would want to implement accordingly.

 


 

Search engines crawl your pages and look for sitemaps, which facilitate the process of indexing your pages. Check for sitemaps: if your competitors do not have any, you have the chance to be one step ahead of them in terms of onsite SEO. For this purpose, if you do not have sitemaps yourself, it is advisable to generate them and submit them to Google and Bing respectively.
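If you need to generate a basic sitemap yourself, here is a minimal sketch that writes a sitemap.xml for a hypothetical list of pages; a real site would also add optional tags such as lastmod from the sitemap protocol:

from xml.etree import ElementTree as ET

# Hypothetical list of pages to include in the sitemap.
urls = [
    "https://www.example.com/",
    "https://www.example.com/luxury-accommodation",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = u

# Writes sitemap.xml, ready to submit to Google and Bing.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)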

 

A well designed and well developed website can easily be parsed by search engines. So if you want to see how your competitor’s website is seen by search engines, use an HTML validator (against the W3C standards).
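One way to automate this is the W3C Nu validator, which, to my knowledge, accepts a POSTed document and can return its findings as JSON; a sketch under that assumption, with made-up markup:

import json
import urllib.request

# Hypothetical markup to validate.
html = b"<!DOCTYPE html><html><head><title>Test</title></head><body></body></html>"

req = urllib.request.Request(
    "https://validator.w3.org/nu/?out=json",  # assumed JSON output endpoint
    data=html,
    headers={"Content-Type": "text/html; charset=utf-8"},
)
with urllib.request.urlopen(req) as resp:
    for msg in json.load(resp).get("messages", []):
        print(msg.get("type"), "-", msg.get("message"))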

 

It is widely known that Google (and Bing) do not like repetition in a website’s content. Google rewards originality and fresh content. Thus, it is essential to diagnose whether your competitors have duplicated content (and whether your website does) or lack content altogether. The best way to make this diagnosis, especially when you run an online shop, is the Duplicate Content tool of Web SEO Analytics, where you can compare 2 websites (your competitors’ or your own) and see whether their content is repetitive/duplicated or not. This makes it easier to design your own strategy around a website with fresh and original content.
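For a rough, tool-free sense of how such a comparison can work, here is a sketch that scores two made-up product descriptions by the Jaccard similarity of their word shingles, a common duplicate-detection technique; it is an illustration, not how the Web SEO Analytics tool works:

def shingles(text, n=5):
    """Lower-case word n-grams ("shingles") of the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Share of shingles the two texts have in common."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "Compact camera with 20 megapixel sensor and 10x optical zoom lens"
page_b = "Compact camera with 20 megapixel sensor and 12x optical zoom lens"
print(f"similarity: {jaccard(page_a, page_b):.0%}")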

 

With the completion of this analysis you will be able to find and eliminate your weaknesses, and with the onsite SEO competitor analysis you will gradually strengthen your position within the market. The Duplicate Content tool is extremely useful for those of you who own online shops (e.g. electronics). Even though the technical specifications of, say, a camera are fixed and it’s difficult to find different words to describe the same features, Google still encourages fresh, unduplicated content. Thus, the Duplicate Content tool minimizes, in an automated way, the hassle of finding out whether or not your website has duplicate content in relation to other pages.

How to Create an effective SEO Competitor Analysis – Part 1

Posted in Natural Search SEO

 

The SWOT analysis (Strengths, Weaknesses, Opportunities, Threats) is an integral part of the SEO process and should not be overlooked. Before you start implementing your SEO strategy in practice and see quantifiable results, it is vital to start working on your SWOT analysis. An essential element of it is your onsite SEO competitor analysis. According to Sun Tzu, “knowing your enemy is half the battle”, and applying this quote to today’s corporate world, an effective and thorough onsite SEO competitor analysis can offer a major long-run strategic advantage over your competitors!

 

Onsite SEO competitor analysis is possible because our competitors’ websites are publicly available on the web. Additionally, there are tools that allow us to measure the efficiency of our competitors’ websites in relation to our own. There are two cases to consider when conducting a competitor analysis, after which you adjust your strategy accordingly:

 

• In the first case, you conduct a competitor analysis when you already have a website, by comparing it with your competitors’ websites.
• In the second case, you conduct a competitor analysis when you don’t have a website yet. In that case you basically start off from your weaknesses. However, keep in mind that weaknesses can at the same time be potential opportunities that can turn into strengths. In this blog post (split into 2 parts), you are going to find some simple and effective tips for the first case.

 

The list below states some of the essential things we should pay attention to on our own websites. The “juicy” part is the onsite SEO competitor analysis, because knowing the strengths and weaknesses of our competitors helps us implement the best possible strategy for maximised and optimised results! Even if your competitors are established, traditional players within the market, if their SEO strategy is poor it is possible to outrank them once you define a clear strategy, starting with your onsite SEO competitor analysis.

 

In your competitors’ source code, page by page, find the keywords they chose to use. Note any structural problems in terms of length (for example, the keywords, including spaces, should not exceed 250 characters); if they exceed that limit, their SEO strategy is poorly developed. Additionally, you can use the SERP Analysis tool of Web SEO Analytics to analyse the competition in search engine results and become familiar with your competitors’ SEO strategy.

 

The structure of a URL is of vital importance, so check out your competitors’ URL structure. If a URL is too long, with endless subdirectories, then again their onsite SEO strategy is poorly developed. A best-practice example is www.example.com/luxury-accommodation, instead of a long URL such as www.example.com/Greece/Athens-acropolis-hotels/luxury-accommodation/. In other words, a URL should be short, descriptive and keyword-rich so as to enhance your onsite SEO.
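Producing such short, keyword-rich URLs is easy to automate; a minimal slug generator, with a made-up page title:

import re

def slugify(title):
    """Turn a page title into a short, keyword-rich URL slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# Hypothetical page title.
print(slugify("Luxury Accommodation in Athens"))  # luxury-accommodation-in-athens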

 


 

The meta title should not be more than 65 characters. The most important keywords should come before the pipe (|), while your competitor’s brand name should come after it. If that’s not the case, avoid following their lead, because ideally you should aim for something like this: SEO Keywords | Brand Name

 

The meta description (found again in the source code) should not exceed the limit of 160 characters, including spaces. Focus on whether your competitors use eye-catching keywords that will enhance their ranking position on Google. If they don’t, then most likely you have the competitive advantage.

 

The meta keywords review was covered earlier in the keyword health-checks section. It is important not to exceed the 250-character limit. Also check whether your competitor uses the same keywords on every page of the website; if so, we have the so-called phenomenon of keyword cannibalisation.
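To pull the three limits above together (title ≤ 65, description ≤ 160, keywords ≤ 250 characters), here is a minimal checker; the meta values are hypothetical examples of what you might scrape from a competitor’s page:

# The limits cited above.
LIMITS = {"title": 65, "description": 160, "keywords": 250}

# Hypothetical meta values from a competitor's page.
meta = {
    "title": "Luxury Rooms in Crete | Example Hotel",
    "description": "Book luxury rooms in Crete with sea view, breakfast included.",
    "keywords": "luxury rooms, crete hotel, sea view rooms",
}

for field, limit in LIMITS.items():
    length = len(meta.get(field, ""))
    status = "OK" if length <= limit else f"over by {length - limit}"
    print(f"{field}: {length}/{limit} chars ({status})")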

 

Best practice for internal linking requires the use of related links that navigate users through your site and direct them back to your homepage. If your competitors have not implemented this practice, their SEO strategy is clearly not well structured. For this purpose, you can optimize the link structure of your pages using the Link Structure tool of Web SEO Analytics.

 

You should also bear in mind that the number of links on a page should not exceed 100 (according to Google). It is worth checking whether your competitors exceed this number or, on the contrary, have no links at all!
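Counting links on a competitor’s page is easy to script; a minimal sketch with the standard library’s HTML parser, fed hypothetical markup (in practice you would feed it the fetched page):

from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href> tags in a page."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

counter = LinkCounter()
counter.feed('<p><a href="/">Home</a> <a href="/rooms">Rooms</a></p>')
print(counter.count, "links (guideline above: no more than 100)")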

 

Stay tuned for the 2nd part of the onsite SEO competitor analysis, in which you will read about page load time, image optimization, heading tags, XML sitemaps, website coding/code quality checks, accessibility of the website, duplicate content checks and usability checks…

German Law Will Allow Free “Snippets” By Search Engines

Posted in Natural Search SEO

The good news for search engines like Google is that a proposed German copyright law won’t require them to pay to show short summaries of news content. However, uncertainty remains about how much might be “too much” and require a license. The new law is expected to pass on Friday.

 

Der Spiegel explains more about the change:

 

Google will still be permitted to use “snippets” of content from publishers’ web sites in its search results….
What the new draft does not stipulate, however, is the precise definition of the length permitted.

 

The draft bill introducing an ancillary copyright for press publishers in Germany (Leistungsschutzrecht or LSR) goes to a final vote at 10am German time on Friday. Below is my background on the hearings that happened this week, which in part led to the snippets change.

 

Beyond all the procedural and constitutional objections to the Leistungsschutz bill, there are also a couple of technical and political ones. Critics (and there are plenty of them) raise concerns that the collateral damage from this change in copyright will hurt search engines, innovation in general and especially smaller press publishers. They point to ambiguous language in the bill that will cause legal uncertainty and lawsuits that will take years to be settled.

 

The German government and supporters of the bill have done little to address these objections. On Saturday, I published an advance copy of the government’s answers to a letter of inquiry from the opposition Left Party. There is a continuing pattern in the government’s responses of referring open questions to the courts or simply ignoring the question.

 

One of the last opportunities to discuss the mechanics of this ancillary right within the parliament was a 90-minute expert hearing on Wednesday at the subcommittee for New Media (Unterausschuss Neue Medien, UANM) of the German Parliament.

 

Public invitations for this hearing were sent out only a couple of days ago, after two weeks of behind-the-scenes negotiation between the governing factions in parliament (Christian Democrats (CDU/CSU) and Liberal Democrats (FDP)) and the opposition factions (Social Democrats, Left Party and Green Party).

 

CDU/CSU and FDP had previously refused to schedule another hearing alongside the judiciary committee hearing in January, saying that all questions could also be addressed in this expert hearing. As it turned out, there were a couple of technical questions that could not be addressed, because none of the invited experts at the judiciary committee hearing were experts in the field of technology. How could anyone have known that there are at least two kinds of experts out there!

 

The invited experts were:

• Dr. Wieland Holfelder, engineer at Google (there was a consensus agreement by the committee members that he could pass non-technical questions to legal counsel Arnd Haller from Google, who was sitting behind him)
• Dr. Thomas Höppner, representative of the press publishers’ association BDZV
• Prof. Dirk Lewandowski, University of Applied Sciences, Hamburg
• Michael Steidl, International Press Telecommunications Council (IPTC), London

Two experts were invited by the majority factions (Höppner and Steidl), two by the opposition (Holfelder and Lewandowski). The hearing followed the usual procedure: there were three rounds of questions from members of parliament, with each faction posing two questions to one expert or one question to two experts. There was no opportunity for introductory statements by the experts and no strictly enforced time limit on answers.

 

So, in order for an expert to be allowed to speak, he has to be handed a question by a member of parliament. An expert is not allowed to ask questions or offer direct refutations to other experts. The resulting strategy is that each side gives softball questions to its own experts and potentially compromising questions to the experts from the other side. At many hearings it has to be assumed that questions were exchanged before the meeting and that there is some level of expectation about what the answer might be. This is especially true for partisan experts whose employers directly benefit from or suffer by the outcome of this legislative process.

 

Some of the softball questions gave the experts the opportunity to explain how robots.txt works (Holfelder) or to explain the shortcomings of robots.txt (Steidl and Höppner).

 

Holfelder introduced himself as an engineer who implemented his own web crawler 14 years ago. He distributed printouts of robots.txt examples and the resulting snippets on search engine results pages. He explained the additional meta-tags that Google uses to add or remove content from Google (or any other of the leading search engines). To some extent, his presentation felt both verbose and strangely elementary. In an ideal world, none of this information would have been new to a subcommittee that specifically focuses on such topics.
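For readers unfamiliar with the mechanism Holfelder demonstrated, here is a minimal Python sketch of how a crawler consults robots.txt, using the standard library’s parser; the publisher URL is made up:

from urllib import robotparser

# Hypothetical publisher site.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example-zeitung.de/robots.txt")
rp.read()  # fetches and parses the file

# May Googlebot fetch this (made-up) article URL?
print(rp.can_fetch("Googlebot", "https://www.example-zeitung.de/artikel/123"))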

 

Petra Sitte (Left Party) had asked Holfelder to comment on ACAP, a protocol that was proposed by a few publishers and has failed to gain any meaningful acceptance in the market. Holfelder provided a few examples in which implementing ACAP would be prone to spam abuse, as it mandates the way in which provided descriptions have to be shown.

 

Konstantin von Notz (Green Party) asked Holfelder whether it was possible for a search engine provider to detect whether specific content on a web site is covered by this LSR or not. This is – in my opinion – one of the most important questions about this bill, because it outlines the potential for huge collateral damage and legal uncertainty over the coming years.

 

The ancillary copyright is awarded to a press publisher (defined as anyone who does what the press usually does) for his press product (a product of what a press publisher usually does). It exists alongside the copyright awarded to the author, who can license his/her content to anyone else. This means it is not the text itself that determines whether content is covered by the LSR.

 

Here is an example: a journalist maintains his personal web site to advertise his services as a freelancer. He has a selection of half a dozen of his articles on the site to show potential customers his journalistic skills. These articles are of course protected by copyright. They will not, however, be covered by the ancillary copyright, because he is not a press publisher. The very same texts on a magazine’s web site will be covered by the LSR. How can a search engine determine whether text on a web site is subject to both copyright *and* LSR?

 

Holfelder replied that Google has a couple of heuristics to determine whether a certain page is provided by a press publisher. However, this law has no provisions for “honest mistakes”. If Google fails to detect LSR content and has not received prior permission to index it, Google faces legal consequences. There is no such thing as a “warning shot”, nor any obligation for the press publisher to proactively inform a search engine whether it thinks a certain page is LSR-covered. This is the legal equivalent of a minefield.

 

Holfelder stated that a search engine would in this scenario tend towards overblocking in order to avoid a lawsuit for violating the LSR.

 

Höppner, the press publishers’ expert, spent his time mocking a comparison about this bill involving taxis and restaurants. He then stated how services such as Google News substitute for visiting the original pages, with some rambling about a Google service called “Google Knowledge”. It was hard to tell whether he meant the failed Google Know project or the Knowledge Graph in standard Google search.

 

His main argument on robots.txt was a passive-aggressive one: publishers do not like robots.txt per se, they merely use it to fight for the last crumbs that search behemoths like Google have left them. In other words, if a press publisher provides meta description text or Twitter cards, this should not be seen as any kind of agreement to actually use this text to build snippets in a search engine. I severely doubt that this position would hold up in court, or that it reflects press publishers’ actual motivation.

 

Prof. Lewandowski’s contribution to the hearing was an interesting one, as he is the first expert in a long time who does not seem to have an agenda with respect to the LSR. His views were balanced and nuanced, highlighting both the high level of acceptance of robots.txt and some of its shortcomings. He pointed out that, at least at Google News, the limited number of sources and the opt-in mechanism (yes, it’s more complicated than that) would permit running such a service in an LSR world.

 

Steidl used his time to explain IPTC’s contribution to the world of standards and to mention the RightsML project, which is in active development. He criticised robots.txt for having no governing organisation and for failing to express rights at a sub-article level.

 

Neither Google nor the press publishers were very eager to present actual numbers on Google News usage or on how visitors are directed to third-party web sites.

 

In round two, Google’s legal counsel Haller was asked how Google will react to this bill if enacted. He replied that Google does not know the final version of the bill, and that Google has not yet decided how to implement it. He pointed out that his company would have to deal not only with publishers from Germany but with publishers from the entire European Economic Area, who could exercise their own LSR rights against Google.

Local SEO Quick Wins: How to succeed using local SEO

Posted in Natural Search SEO

The rapid adoption of smartphones and tablets gives SMEs (Small and Medium Enterprises) a unique opportunity to optimize their websites locally for smartphone/tablet devices, reaching an audience like never before. Below, some methods for reaching this audience effectively are presented.

The number of users on smartphone or tablet devices is increasing exponentially, as various research findings show. Recent findings from the Pew Research Centre show that:

As of August-September 2012:

• 85% of American adults have a cell phone
• 45% of American adults have a smartphone
• 25% of American adults have a tablet computer

The figures above are not mere theory; practice itself shows that mobile is on the rise, and thus it’s essential for businesses to invest in their mobile presence, particularly when it comes to local SEO.

To see the exponential growth of mobile in practice, here is a comparison of a client with a mobile presence: we are comparing its 2012 and 2011 mobile traffic. As the screenshot shows, there is exponential growth in mobile traffic:

Visits: +101.07%
Unique Visitors: +107.62%

[Screenshot: mobile traffic comparison, 2012 vs 2011]

If you have already set up Google Analytics, how can you see this?

(Click) Audience Overview –> (choose) Advanced Segments (under default segments) –> (tick the box) Mobile Traffic

In this way you will see the data available for your mobile traffic.

Similarly, you can do the same thing for your tablet traffic!

What SMEs haven’t realised yet is that users are changing their search behaviour in search engines (when they perform a query) on their mobiles and tablets. It is common knowledge that users on tablets/smartphones prefer to type fewer keyphrases when they perform a query on a mobile device, whereas on desktop devices their search behaviour changes.

So taking into consideration the aforementioned factors:

• Undoubtedly, the number of mobile/tablet users in developed countries is increasing exponentially. Additionally, the number of mobile and tablet users in emerging economies is rising fast. These two trends lead to the conclusion that local businesses should invest in local SEO.
• Mobile user behaviour: taking into consideration minor (but decisive) factors, when mobile users type on their mobile/smartphone they don’t perform lengthy queries but laconic ones. Local SMEs should take this factor into account when they perform their local SEO.

Now it’s time to consider all the local SEO actions that need to be taken in order to achieve optimized results for your local business.

On-site optimization: follow the same techniques you would use for your website – mobile URLs, meta-tags, headings and keywords (but this time targeted locally), social media/sharing buttons, contact details in a discernible position, and a “call” button.

For your initial research it is advisable to use the AdWords Keyword Tool by Google, filtering the information you want. For your local business you’d better select “All Mobile Devices”, as you see in the example below, and start conducting local keyword research.

[Screenshot: filtering for mobile devices in the Google Keyword Tool]

Make the optimal decision for your local business’s mobile site by considering an indicative list of domains/subdomains under which your final URL can appear:

• m.example.com
• mobile.example.com
• example.mobi

Make sure that you generate a mobile sitemap, exactly as you would for your desktop website.

Don’t forget to verify your mobile site on Bing and Google Webmaster Tools respectively, and then submit the URL of your mobile website to each.

As with regular (desktop) websites, Google and Bing crawl the content of your website. Similarly, Google has Googlebot-Mobile, which crawls the mobile content of your local business’s site.

Worried about how mobile-friendly your website is and how it will affect your users’ mobile experience? You can validate it with the W3C Mobile Checker.

The local SEO of your mobile site can also be affected by its load time. If it takes more than 4-5 seconds, the user experience suffers and this damages your mobile presence. For this purpose, just enter your mobile site into PageSpeed Insights and you’ll get suggestions for making your site faster.
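As a very rough first check before reaching for PageSpeed Insights, you can time the raw HTML download yourself; a sketch with a hypothetical mobile URL (PageSpeed measures much more, such as assets and rendering):

import time
import urllib.request

# Hypothetical mobile site; this only times the raw HTML download.
start = time.monotonic()
with urllib.request.urlopen("https://m.example.com/") as resp:
    resp.read()
elapsed = time.monotonic() - start

print(f"HTML downloaded in {elapsed:.2f}s (guideline above: under 4-5 seconds)")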

If you want the mobile site of your local business to be found, it is advisable, apart from the onsite optimization process, to implement a link building approach for optimal results.

Some factors to take into consideration for the offsite optimization of your local business are the following:

If you have a blog, write posts for the needs of your business, and if you want to add tags, it’s better to use your area and the product related to your business. For example, add tags for: Athens, the Acropolis, coffee shop, cafeteria, etc.

If possible, try to add your business to location-based apps that are compatible with tablet and smartphone devices (on operating systems like iOS, Android etc.). Indicatively, some of the location-based apps you can consider are:

• Cellfire
• The Coupons App
• Where
• Checkpoints
• Swirl
• Point Inside

Add/claim your business in local directories, like Yelp, Yahoo, Yellowpages.com, Superpages.com etc., for optimized local results.

Include your business on Foursquare (more than 25 million people use Foursquare, with 2.5 billion check-ins to date).

Of course, your local SEO strategy should include Google Maps and Google+, by adding your business (address, telephone and other contact details, plus keywords related to your business). Don’t forget that Google has also released a mobile app for Local: once users share their location with Local, they can find restaurants, cafes, pubs and attractions.

If you want to dig deeper for high quality websites, blogs, directories or URL submission sites related to your business that will help you enhance your local presence, it is strongly advised to use the Backlink Hunter of Web SEO Analytics.

Just enter the (local) keyword your business is related to (e.g. Greek restaurants London), choose the type of site (blog, website etc.) and you will get all the sites directly related to your type of business. The Backlink Hunter shows how relevant each website is, its domain score, domain authority and PageRank, and even contact details (so you can make sure these sites have a mobile version as well).

Along with your local SEO actions, Google now lets you micro-geotarget a specific area.

Let’s assume that you own a business related to the travel/hotel industry.

Now you can aim at a specific region (e.g. Heathrow Airport exclusively) and further filter the reach of your campaign to tablet users only (who are at Heathrow Airport).

[Screenshot: location targeting in AdWords]

You can choose the filters you are going to apply in order to micro-target the audience of the area, according to:

• Mobile/tablet devices
• Operating systems
• Device models
• Operators and wifi etc.


Content Marketing for SEO – Does it really work? 4 Step Web Content Development Strategy

Posted in Natural Search SEO

First of all, what is Web Content Development?

Web content development is the procedure of researching and gathering valuable information, defining objectives, finding and evaluating keywords to use, organizing the structure for writing and, finally, publishing the content.

Undoubtedly, in the world of the internet and web development, content is king. Content is the medium through which you provide interesting and valuable information to your audience. By providing well-optimized information, your content becomes visible to search engines and ranks higher, and is therefore easily found and read.

So, content development is all about keywords and appealing information! In this blog post you will find the principal steps for setting up the right SEO strategy for well-optimized content.

Specify your competitors and extract information from their sources regarding the subject you want to write about. Gather as much information as you can! This initial research will help you significantly in finding ideas about which keywords to use when developing your content.

You must specify the main goals (products or services) that need to be promoted through your content (blog post, article, image, video or other interactive forms). If the content refers to a website, define the pages that need to be optimized. Take it into serious consideration that each page of the website needs to be optimized for unique keywords/keyword phrases related to one and only one subject.

Keywords

Make a list of possible keyword phrases to be used for the optimized content. If you are going to create content for a website, make a separate list of keywords for each page you are going to optimize. The keyword phrases listed should include both head terms and long-tail ones. Draw inspiration from:

The initial research you have performed and the objectives and goals you have set. Ask yourself:

• Which keywords/keyword phrases best describe the product or service that needs to be promoted?
• What kind of terms have you found through competitors’ sources that can be used to describe the relevant product or service?
• Which terms do your competitors use? Be careful to select only those keywords that are highly relevant to your goals and objectives. Selecting them for your initial list does not necessarily mean you will use them all; the list will be finalized after evaluating keyword performance.
• If you were a user, which search terms would you use to find the specific promoted information?

Your Google Analytics account, excluding branded terms: in the organic results, exclude the branded terms and narrow down your keyword list to the keywords that have brought the highest traffic.

PPC (Pay Per Click) keywords provided by Google Analytics: write down the PPC keywords that have generated the most traffic.

So, after you have ended up with a list of keywords, use an effective keyword tool such as the Google AdWords Keyword Tool or the very effective Keyword Research tool of Web SEO Analytics and check your keyword phrases’ performance. The greater the number of searches, the more popular a keyword is. The greater the competition, the harder it is to gain good ranking results for the specific keyword phrases.

After the research, select the best keywords/keyword phrases that define your content according to relevancy and query popularity.

Then, use the Keyword Difficulty tool of Web SEO Analytics to evaluate your chances of achieving good rankings for each of the selected keywords.

By combining the Keyword Research and Keyword Difficulty tools you will be able not only to find all the popular keywords related to your subject but also to decide on the keywords with which you are most likely to accomplish high rankings in search engine results.
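As an illustration of how you might combine the two signals, here is a sketch that ranks hypothetical keyword candidates by searches per difficulty point; the scoring rule is our own example, not either tool’s formula:

# Hypothetical research output: (keyword, monthly searches, difficulty 0-100).
candidates = [
    ("luxury accommodation athens", 1900, 62),
    ("athens boutique hotel", 880, 41),
    ("hotel near acropolis breakfast", 210, 18),
]

# Rank by searches per difficulty point - an illustrative trade-off rule.
for kw, searches, difficulty in sorted(
    candidates, key=lambda c: c[1] / max(c[2], 1), reverse=True
):
    print(f"{kw}: {searches} searches/month, difficulty {difficulty}")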

After you have ended up with the final list of top keywords, try to spread them through the content accordingly. How?

Keyword Structure

1. There is no ideal number of times to use a keyword on a web page. If this still sounds confusing: as a rule of thumb, each keyword/keyword phrase should appear no more than 3 times per 300 words (e.g. per page of a website) – see the sketch after the list below. The key is to make the selected keyword phrases appear naturally inside the content of each page and at a reasonable rate.

2. Try to use the specific keywords / keyword phrases on important positions inside the page such as:

• The URL
• The title
• The first sentence, or at least the first paragraph
• Headings and subheadings
• Image file names and alt texts
• Meta (title, description, keywords)
• Text links to related content inside the website
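Following up on the density rule of thumb in point 1, here is a minimal sketch that counts how often a phrase occurs relative to total words; the sample text and phrase are made up:

import re

def keyword_density(text, phrase):
    """Return (occurrences of phrase, total words in text)."""
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return hits, len(text.split())

# Made-up page copy and target phrase.
text = (
    "Our luxury rooms in Crete overlook the sea. Each of the luxury rooms "
    "has a private balcony, and breakfast is included with every booking."
)
hits, words = keyword_density(text, "luxury rooms")
print(f"'luxury rooms' appears {hits} times in {words} words")  # aim: <= 3 per 300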

The final step is to take a pen and write the article! When it is ready, publish it and share it through social media.
