Thursday 29 November 2012

Using Google for Online Marketing


Google offers several online marketing tools that help a business promote and advertise itself online, set goals, and target customers.

See the following presentation for more details.


Wednesday 28 November 2012

Effective PPC Campaign Tips for Setting Up Ads






Use The Keyword in the Headline as Much as You Can

This is pretty self-explanatory. Every time you use a keyword in your ad, be it in the headline or anywhere else, it gets bolded by the system when it matches the user's query (see the example below). That bolding makes the ad much easier for people to notice, and ads that get noticed get clicked.
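For instance, a hypothetical ad for the keyword "blue widgets" might render like this (the terms the system would bold are shown in capitals; the domain and offer are invented for illustration):

BLUE WIDGETS - 50% Off Today
www.example.com/BLUE-WIDGETS
Huge selection of BLUE WIDGETS in stock. Order today for free delivery.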


Increase Multi-Word Keyword Portfolio
The success of your PPC campaign is all about keywords. It takes only a small amount of time each day to brainstorm keywords that are relevant to your product or service and expand your keyword portfolio. The more relevant keywords you have, the better your chances of converting visitors into sales.

Multi-word keyword phrases attract highly targeted traffic to your website and tend to rank higher than single- and double-word phrases. Coming up with as many multi-word phrases as you can will increase your sales conversion ratio, since visitors using them are searching for that specific information. For example, "blue shoe laces Nike sneakers" is a multi-word phrase, commonly referred to as a long-tail keyword, and it tells you exactly what the customer is looking for.

Monitor Your Competitors
It is smart business to keep up with what your competitors are doing with their PPC advertising campaigns. Changes in your competitors' campaigns can affect keyword prices as well as your ad's position in the results. Regular competitive analysis will help you stay ahead of the game.

Include an offer in the copy. "Free," "Save $XX," or "XX% Savings" will usually lift response. You can also mention a gift or bonus.

Relevancy! Relevancy! Relevancy! In case you missed the common theme throughout: Relevancy is the most important element of PPC search engine marketing. If you ignore relevancy, you will likely be frustrated with low click-through rates and an unprofitable campaign.

Saturday 24 November 2012

Negative On-Page SEO Techniques to Avoid






·         Avoid using "hidden" or invisible text on your page for the purpose of higher search engine placement. For example, the text for the search phrase "Widget" is present in the HTML, but its font color has been set to white on a white page background: the textual content is actually there, yet the words are "hidden" from the surfer (see the sketch after this list). This is frowned upon by search engines and frequently results in your site being penalized.
·         Avoid negatively positioned HTML tags. Div (division) tags are the usual vehicle: unscrupulous SEO services may insert them into your page with negative x/y coordinates, placing keyword-rich content outside the visible page while the text remains in the HTML source (see the sketch after this list). The search engine finds the keywords in the text, yet the surfer never sees them. Again, a technique to be avoided and not recommended under any circumstances.
·         Avoid cloaking and sneaky redirects. Cloaking refers to serving up two different versions of content depending on who is visiting: a regular web surfer is served one page, while a search engine spider is served another page built specifically for it. The page served to the spider is typically garbled text with no meaning to a human, stuffed with keywords and search phrases. This technique is not recommended and will likely get your site penalized or banned from search engines.
·         Avoid duplicate content. Duplicate content means creating one website with content on topic A and then repeating that content over and over again across multiple websites. In theory you could create one website, achieve a high ranking with it, and then clog up the search engines with the same content duplicated on multiple domains. Again, this is not recommended and should be avoided.
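To make the first two points concrete, here is a minimal HTML sketch of both techniques; the keyword, colors and coordinates are invented for illustration. This is exactly what NOT to do:

<!-- hidden text: white text on a white background -->
<body style="background-color:#ffffff">
  <p style="color:#ffffff">widget widgets cheap widgets buy widgets</p>

  <!-- negatively positioned div: keyword-stuffed content pushed off-screen -->
  <div style="position:absolute; left:-2000px; top:-2000px">
    widget widgets cheap widgets buy widgets
  </div>
</body>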

Thursday 22 November 2012

SEO Optimization for Mobile Websites





Configure mobile sites so that they can be indexed accurately:

It seems the world is going mobile, with many people using mobile phones on a daily basis, and a large user base searching on Google’s mobile search page. However, as a webmaster, running a mobile site and tapping into the mobile search audience isn't easy. Mobile sites not only use a different format from normal desktop sites, but the management methods and expertise required are also quite different. This results in a variety of new challenges. While many mobile sites were designed with mobile viewing in mind, they weren’t designed to be search friendly.


Here are troubleshooting tips to help ensure that your site is properly crawled and indexed:

If your website doesn't show up in the results of a Google mobile search, even when using the site: operator, your site may have one or both of the following issues:

1. Googlebot may not be able to find your site
Googlebot must crawl your site before it can be included in our search index. If you just created the site, we may not yet be aware of it. If that's the case, create a Mobile Sitemap and submit it to Google to inform us of the site’s existence. A Mobile Sitemap can be submitted using Google Webmaster Tools, just like a standard Sitemap.
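A Mobile Sitemap is a standard XML Sitemap with an extra mobile namespace and an empty <mobile:mobile/> tag on each entry. A minimal sketch (the URL is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://mobile.example.com/article.html</loc>
    <mobile:mobile/>
  </url>
</urlset>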


2. Googlebot may not be able to access your site
Some mobile sites refuse access to anything but mobile phones, making it impossible for Googlebot to access the site, and therefore making the site unsearchable. Our crawler for mobile sites is "Googlebot-Mobile". If you'd like your site crawled, please allow any User-agent, including "Googlebot-Mobile", to access your site. You should also be aware that Google may change its User-agent information at any time without notice, so we don't recommend checking whether the User-agent exactly matches "Googlebot-Mobile" (the current User-agent). Instead, check whether the User-agent header contains the string "Googlebot-Mobile". You can also use DNS lookups to verify Googlebot.
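In robots.txt terms, the simplest safe configuration is one that does not block the mobile crawler at all. For example, to explicitly give Googlebot-Mobile the run of the site (an empty Disallow value means "allow everything"):

User-agent: Googlebot-Mobile
Disallow: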


Verify that Google can recognize your mobile URLs

Once Googlebot-Mobile crawls your URLs, we then check whether each URL is viewable on a mobile device. Pages we determine aren't viewable on a mobile phone won't be included in our mobile site index (although they may be included in the regular web index). This determination is based on a variety of factors, one of which is the DTD (Document Type Definition) declaration. Check that your mobile-friendly URLs' DTD declaration is in an appropriate mobile format such as XHTML Mobile or Compact HTML; if it's in a compatible format, the page is eligible for the mobile search index.
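For reference, a suitable declaration for an XHTML Mobile Profile page looks like this (check your pages against the specification you actually target; Compact HTML pages use their own cHTML DTD):

<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN"
  "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">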

Wednesday 21 November 2012

Calculate the Return on Investment (ROI) for PPC Campaigns

Calculating ROI is one of the basic tenets of PPC, and yet many advertisers don’t consider it or even understand it.

A lot of advertisers perform campaign optimizations based solely on conversion rate or cost per conversion, choosing the ads and keywords with the best metric and calling it a day.

For ecommerce goal analysis, the method of calculation is similar. The biggest difference is that ecommerce data is real and accurate, compared to a lead form's estimated goal value.
In AdWords, we can look at the money we spent to reach our goals in the same time frame. Compare that with the purchases made through CPC and we have a precise number we can confidently call ROI.

Lead Generation

- Conversion rates (found in Google Analytics)
- Conversions (found in Google Analytics)

Ecommerce

- ROI (found in Google Analytics and AdWords)
- Revenue (found in Google Analytics)

I've always found that ROI is one of those terms that has been overused and abused by so many people that there is confusion about how best to calculate it. Personally, I like to use the following formula when discussing the ROI of any PPC campaign:

ROI = [Contribution] / [Cost]

So to calculate Contribution for a PPC campaign:
([Your average profit per sale] x [Estimated number of Conversions]) – [PPC Spend]

To demonstrate more fully, let’s take the following example:

Monthly PPC Spend: £1,500
Average Profit per Sale: £50
Number of Conversions (Sales) per Month: 75

and so the Contribution to Margin of the PPC campaign is:
(£50 x 75) – £1500 = £2,250
and your ROI would be:
£2,250 / £1500 = 150%

Phew! But there is an easier way. We have just created an ROI estimator / calculator spreadsheet that you can now download for free. We hope that it will be a useful tool for you when reviewing your PPC campaigns.

Facebook Launches Conversion Measurement Tool

Facebook began rolling out a conversion measurement tool on Friday to help marketers bridge the data gap between social ads and online sales.

Facebook Inc (NASDAQ:FB) is bringing a new, advanced tool that will enable online retailers and marketers to track the online purchases of Facebook users who have viewed their ads, Reuters reports. This development is reportedly going to help e-marketers.

Third parties such as social shopping app maker Glimpse have been offering solutions to specific aspects of the social commerce “problem” for some time, particularly the disparate data sets available to online retailers.

David Baser used the example of an online shoe retailer to demonstrate the function and outcomes of the tool. Marketers can see how many people bought shoes after seeing an ad, but the personal identity of purchasers remains private.
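Mechanically, conversion measurement tools of this kind work by placing a small tag on the order-confirmation page: when a user who was shown the ad later completes a purchase, the tag fires and the ad view and the sale can be joined in aggregate. A generic sketch of the idea; the endpoint and IDs below are placeholders, not Facebook's actual code, which you would copy from your ads account:

<!-- conversion tag on the "thank you for your order" page -->
<img src="https://ads.example.com/conversion?pixel_id=YOUR_ID&event=purchase"
     width="1" height="1" style="display:none" alt="" />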

Friday 16 November 2012

Best Marketing Techniques for Your Business

Internet marketing boosts your business. It not only promotes your products and services but also helps you build a strong relationship with your customers, and your website is often the only thing expressing your brand and creating its identity. The internet has brought tremendous changes to the field of marketing, and internet marketing has come a long way in boosting business and generating traffic for websites. These techniques may cost you, but they deliver good results, so invest in internet marketing to boost your business. Below are the top three internet marketing tips for doing so.


Effective email marketing

Email marketing is a desirable internet marketing technique. Focus first on the subject line: it must be attractive and interesting, otherwise readers will delete the email without even opening it. The subject line is what captures the reader's interest, so write it in a humorous or otherwise compelling way that encourages readers to open the email. Make sure the body makes clear exactly what you want the prospect to do. Always personalise your emails with the prospect's first name; when readers see their own name, they feel personally addressed and are more likely to open and read the mail.

Search engine optimisation

SEO is both cost-effective and results-effective, and it is the best way to enhance your brand's visibility. There are various ways to optimise your site for search engines, but the main thing is content: focus on producing high-quality, creative and informative content. Link building is another way to boost your business; you can insert a link back to your site in articles, do guest posting, and find quality sites to exchange links with.

Social media marketing

Social media marketing is another great way to boost your business. Stay active on social networking sites: create a profile for your business and start connecting with your target audience. You can post your upcoming events and products on these pages, and also post links to your latest blog posts so that readers can easily click through. Always respond to your customers' questions and concerns.

Author Bio:
This guest post was written by Shopie, a tech writer from the UK with an interest in finance. Catch her @financeport. Same-day cash loans are instant loans that are helpful when you run short of funds.

Google's Latest Algorithm Update

The latest algo update being rolled out is predicted to impact around 3% of search queries; to put that into perspective, the original Panda algorithm was said to affect around 12% of all queries. However, we SEOs have learned to take Google's percentile predictions with a pinch of salt after Matt Cutts stated that the "(not provided)" keyword would account for less than 10% of all website traffic.




Before releasing any details on the algorithm update itself, Google kindly gave us some background information on how they feel about search engine optimisation. This is likely intended to counter the speculation that arises in some SEO circles whenever Google makes an announcement; the most recent example is the speculation that followed misreporting of the "over-optimization" penalty, which Matt Cutts discussed at SXSW. There was a suggestion that his speech was perhaps 'anti-SEO', however those who have read the transcript or listened to the talk in full will know that this couldn't be further from the truth.

In this latest blog, Google have left no room for debate, emphatically stating that "SEO can be positive and constructive", that "search engine optimization can make a site more crawlable and make individual pages more accessible", and that "'White hat' search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines." These are only a few examples of the positive endorsement that ethical, organic white-hat SEO received from Google in this blog. The problem G have is with those who manipulate and game the system and rob search users of the quality experience they expect; I refer of course to the propagators of Black Hat SEO. As mentioned, it is sites that use black-hat tactics and violate the Webmaster Guidelines that will be hit by this algo update, in Google's attempt to return higher-quality search results.

G aren't able to reveal specifics about the changes to how they handle certain signals, as this would leave the door open to those wanting to game the system. However, from the examples given in the blog, it seems there is a real focus on on-page aspects of webspam such as keyword stuffing, and excessive exact match outbound links. SEO Consult will also be conducting a review in an attempt to identify other metrics that this algorithm update targets.

The second screenshot in the blog seems to indicate that this is another step by Google to clamp down on the blog networks favoured by spammers to acquire anchor-text rich links. It identifies a piece of blatantly spun content with three links semantically connected to the search query 'loans', which are completely irrelevant to the content in which they are placed. This is the kind of spam that would be found on low-quality blog networks such as BuildMyRank, which was recently de-indexed by Google.



As I alluded to in the second paragraph, Matt Cutts recently spoke about an "over-optimization" penalty that is expected to be rolled out imminently. We've cleared up that this wasn't a criticism of SEO in general, but again of those who abuse the system and lower the quality of results returned to users. We don't think this announcement is directly linked to the over-optimisation penalty, but we expect to see that released soon, most likely with a US launch followed by a global launch, similar to how Panda was launched.

While we haven't seen any dramatic changes in the SERPs just yet (and we're not expecting to see any change for clients), we will be closely monitoring the social networks and SEO blogs for a better understanding of the initial impact of this algorithm update. We have already seen numerous people complaining in the Google Webmaster Forums and in other blog comments about their site incurring a penalty. This seems to indicate that the update has already begun rolling out, but the full impact won't be known until later this week when the update is fully rolled out.

Wednesday 14 November 2012

Latest On-page Optimization Strategy



On-page optimization is one of the most crucial aspects of your SEO strategy. If done in the right way, it can do wonders for your website’s standing within search engines. In fact, I don’t even have to tell you how important on-page optimization is; it’s something that has been marked and measured time and again. The key, however, lies in getting it done in the right way.

To take a step closer to the 'right way', let's go back in time to classic SEO, the real old-school stuff. Classic on-page SEO had a lot to do with keyword usage: the whole idea revolved around having so many keyword or keyword-phrase instances in and around the content that Google would start to associate the page with the targeted keyword and rank it accordingly.

Things like placing the keyword once or twice in the page title, preferably at the start, so that Google understands it is important; incorporating keyword instances in the meta description and meta keywords tags; and placing exact-match keywords a couple of times in the content body, preferably in the top paragraph. Then there were other on-page elements like page headings, images and anchor-text links: using keyword instances in image 'alt' attributes and link 'title' attributes. A quick sketch of these classic placements follows below.
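Here is a minimal sketch of those classic placements for a hypothetical target keyword, "blue widgets" (the page and copy are invented for illustration):

<head>
  <title>Blue Widgets - Buy Quality Blue Widgets Online</title>
  <meta name="description" content="Shop our range of blue widgets ...">
  <meta name="keywords" content="blue widgets, buy blue widgets">
</head>
<body>
  <h1>Blue Widgets</h1>
  <p>Our blue widgets are ...</p>
  <img src="widget.jpg" alt="blue widget">
  <a href="/widgets" title="blue widgets">More blue widgets</a>
</body>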

I am not saying the things mentioned above don't work anymore or are completely devalued. But they are not the only things that will help you. More precisely, on-page SEO has expanded, just like everything else in SEO.

User Intent

First and foremost, really understand what you should rank for. Get your keywords right, and stop optimizing for keywords alone: optimize for users. Understand what the user is looking for with a given search query and what your page delivers. Once the user intent is clear, design your page to satisfy that intent. It's not about having instances of the user's keyword on your page but about answering their question. Treat every search query as a question and build web pages with the sole intent of answering it.

Content Keyword Optimization

Having instances of the exact-match keyword in the content is, in a way, important, but lately Google has been really strict about exact-match keyword optimization, and there are examples of this doing more harm than good.

Modern-day on-page optimization needs to focus less on keyword instances and more on keyword theme building. This means your content needs to touch on concepts closely related to your target theme. Incorporate appropriate keyword variations, and not just variations: synonyms, long tails, semantically connected terms and so on. All these things provide the signals that help Google understand that your page is the most relevant and value-adding for a targeted keyword.

To understand better, let’s take an example. Consider your page is about guitars and you want to optimize for the term ‘Guitars’. Instead of focusing efforts on having the instances of the keyword ‘Guitars’, we need to incorporate content that covers related topics and helps answer all possible questions related to guitars.

We need to incorporate content that talks about types of guitars (acoustic, electric, bass, etc.), popular guitar brands, a brief on tuning a guitar, or even the kinds of music closely associated with guitars, like rock or blues.

Optimize Search Listings

At times, ranking number one is not enough. People abandon search results simply because the SERP listing is not enticing enough. As mentioned before, in today's web search environment the listing has to sell itself; it's not just about the searched-for keyword being present in the SERP. Think of the SERP listing as an advertisement: everything in it (the title, the URL, the description) needs to be useful, shareable and value-adding.

That's not all: a lot of other things can help improve your search engine listing. Microdata markup (Schema.org), authorship markup and video XML Sitemaps can really help create that beautiful, informative listing that entices users to click.
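For example, marking up a product page with schema.org Microdata looks roughly like this (the product, rating and review count are invented):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Blue Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    from <span itemprop="reviewCount">87</span> reviews
  </div>
</div>

With markup like this in place, search engines can show rich snippets (star ratings and the like) directly in the listing.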

Branding

When we are creating these great SERP listings, we also need to take the opportunity to showcase our brand. Think of it this way: a user comes across your listing, finds it interesting, and clicks through to your page. If he likes the content, he may make a note of your brand. The next time he searches for something similar and sees your brand clearly mentioned in a SERP title or description, there is a high probability he will click on your listing even if it is not ranking #1.

Along with strong brand mentions, it also helps to use special symbols like '®' (registered trademark) and '™' (trademark) wherever applicable. This can improve the perceived authenticity of your content in the eyes of users as well as search engines.

Make it Shareable

So, let's say you take care of all the above and come up with great content. Don't stop there: make it socially shareable. If people visit your webpage and really like what they see, give them the ability to share it with others. Lately we have been reading a lot about how social is affecting search and how social signals are considered important by search engines; this is where it starts. Your content is not going to go viral if you don't enable it.
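The simplest version of this is a set of share links on every post. For example, using the standard Twitter and Facebook share endpoints (substitute your own page URL and title):

<a href="https://twitter.com/share?url=http://example.com/post&text=My%20Post">Share on Twitter</a>
<a href="https://www.facebook.com/sharer/sharer.php?u=http://example.com/post">Share on Facebook</a>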


Page Speed

In Google's quest for superior user experience, it announced page speed as a ranking factor. Even though this signal does not carry as much weight as relevancy, it cannot be neglected. This is where techniques like compression and responsive design come into the picture. It all boils down to keeping the user happy: if your page takes a long time to load, whether because of heavy images, a lack of server-side compression or 301-redirect chains, people are not going to stick around.

It has become imperative to ensure your page does not take a long time to load. Take time to look at tools like Google PageSpeed Insights, YSlow and WebPagetest. These tools provide great insights, pinpointing your site's pain areas with respect to page speed.

Conclusion

So, are the things discussed above the only things to do? Do we stop respecting the classic form of SEO? No! A lot of the classic stuff is still really important; what needs to change is the way we do it, while incorporating the new things that have evolved on the on-page front. Aspects such as usability, social and search are not individual entities anymore. They need to work together to make the Web a better place. (Source: www.searchenginejournal.com)

Wednesday 7 November 2012

Google Tag Manager


Google Tag Manager, also known as GTM, is a free container tag system from Google. A container tag helps you manage the different kinds of tags you may have on your site, including web analytics tags, advertising conversion tags, general JavaScript, etc.

Users can add and update their own tags anytime. It’s not limited to Google-specific tags. It includes asynchronous tag loading, so “tags can fire faster without getting in each other’s way,” as Google puts it. It comes with tag templates for marketers to quickly add tags with Google’s interface, and supports custom tags. It also has error prevention tools like Preview Mode, a Debug Console, and Version History “to ensure new tags won’t break your site.”
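Beyond the container snippet itself, which should be copied verbatim from your GTM account, pages can pass information to your tags through the dataLayer object, declared above the container code. A minimal sketch with made-up values:

<script>
  // Declared before the GTM container snippet so tags can read these values
  dataLayer = [{
    'pageCategory': 'blog',      // hypothetical page-level variable
    'visitorType': 'returning'   // hypothetical visitor segment
  }];
</script>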

Tuesday 6 November 2012

How to Create a Robots.txt File

Robots.txt file:

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do.

The robots.txt file tells search engines which directories to crawl and which not to. You can use it, for example, to block crawlers from your image directory if you don't want your images showing up in Google search. Be careful not to use it to try to hide directories you want to keep secret: anyone can view your robots.txt file, so make sure you password-protect directories that need to be secured.


The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.


Creating the robots.txt file

Robots.txt should be put in the top-level directory of your web server.

Take the following robots.txt examples:


1) Here's a basic "robots.txt":
User-agent: *
Disallow: /
With the above declared, all robots (indicated by "*") are instructed not to index any of your pages (indicated by "/"). Most likely not what you want, but you get the idea.

2) Let's get a little more discriminating now. While every webmaster loves Google, you may not want Google's image bot crawling your site's images and making them searchable online, if only to save bandwidth. The declaration below will do the trick:
User-agent: Googlebot-Image
Disallow: /

3) The following disallows all search engines and robots from crawling select directories and pages:
User-agent: *
Disallow: /cgi-bin/
Disallow: /privatedir/
Disallow: /tutorials/blank.htm

4) You can conditionally target multiple robots in "robots.txt." Take a look at the below:
User-agent: *
Disallow: /
User-agent: Googlebot
Disallow: /cgi-bin/
Disallow: /privatedir/
This is interesting: here we declare that crawlers in general should not crawl any part of our site, EXCEPT for Google, which is allowed to crawl everything apart from /cgi-bin/ and /privatedir/. So the rules of specificity apply, not inheritance.

5) There is a way to use Disallow: to essentially turn it into "Allow all": simply enter no value after the colon (:):
User-agent: *
Disallow: /
User-agent: ia_archiver
Disallow:
Here I'm saying all crawlers should be prohibited from crawling our site, except for Alexa, which is allowed.

6) Finally, some crawlers now support an additional field called "Allow:", most notably, Google. As its name implies, "Allow:" lets you explicitly dictate what files/folders can be crawled. However, this field is currently not part of the "robots.txt" protocol, so my recommendation is to use it only if absolutely needed, as it might confuse some less intelligent crawlers.

Per Google's FAQs for webmasters, the below is the preferred way to disallow all crawlers from your site EXCEPT Google:
User-agent: *
Disallow: /
User-agent: Googlebot
Allow: /