Two Ways to Track Profit from Online Shoppers

Internet marketers want to know how to track the results of their online advertising. You cannot rely on web traffic statistics alone: surveys report that more than 80% of online shoppers research product information on the web first, make their decision later, and prefer to purchase offline.

How do you tell whether your online advertising investment is cost-effective?

I would like to recommend two ways that will help you track your profit.

First, have you ever received a discount coupon while shopping or after buying a product? Did it tempt you to buy that product again at the discounted price, or encourage you to try a new product at a special price? I think most people would say 'Yes'.

This is why I suggest using the idea of a 'coupon' in your online promotions, whether through e-mail marketing or even search engine marketing. Not a printed coupon, but a 'printable' one. This interactive approach helps you monetize traffic, create a traceable action, and engage online shoppers who prefer to buy offline.

Because printable coupons are printed on demand by customers, each printout signals an intention to redeem it and purchase your product. And if the offer is enticing, word will of course spread to people they know: their friends will be invited to print the promotional coupon too.

To keep part of your marketing budget under control, set a limit on the number of printable coupons each participant in a promotion can have issued.
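
As a rough sketch of that idea (the function, the cap value, and the e-mail key here are all hypothetical, not part of any particular promotion platform), a small Python routine on the promotion's back end could issue traceable coupon codes and enforce a per-participant cap:

import uuid
MAX_PRINTS_PER_PARTICIPANT = 5          # assumed cap; set it from your budget
issued = {}                             # participant e-mail -> coupons already issued
def issue_coupon(participant_email):
    """Return a unique, traceable coupon code, or None once the cap is reached."""
    count = issued.get(participant_email, 0)
    if count >= MAX_PRINTS_PER_PARTICIPANT:
        return None                     # this participant is over the limit
    issued[participant_email] = count + 1
    return "PROMO-" + uuid.uuid4().hex[:8].upper()   # match this code at redemption
print(issue_coupon("shopper@example.com"))

The unique code is what ties an offline redemption back to the online promotion that generated it.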

The second way to follow the results of online advertising is to use a toll-free number in your promotions. Because many online consumers prefer to act offline, putting a dedicated toll-free number in your advertising banner helps you track the ad's efficiency. When people see the ad and are interested in what you offer, the toll-free number gives them an alternative, free-of-charge channel to contact you, and every call that comes in on that number represents an opportunity to close a sale and a sign that your ad is working.

Try online coupons and a toll-free number to improve your goal conversions from online advertising, and you will profit from online consumers through real-world sales.

Quick Way to SEO Your Site in Less Than 60 Minutes

Matt McGee has very useful information on optimizing your site, with steps you can follow in less than one hour.

SEO Your Site in Less Than an Hour

A. Visit the home page, www.domain.com.

1. Does it redirect to some other URL? If so, that's bad. (See the quick script after this item.)
2. Review site navigation:
  • Format — text or image? Image map? JavaScript? Drop-downs? Text is best.
  • Page URLs — look at URL structure, path names, file names. How long are URLs? How far from the root are they? Are words separated by dashes or underscores?
  • Are keywords used appropriately in text links or image alt tags?

3. Review home page content:

  • Adequate and appropriate amount of text?
  • Appropriate keyword usage?
  • Is there a sitemap?
  • Do a "command-A" (select all) to find any hidden text.
  • Check PageRank via the SearchStatus plugin for Firefox.

4. View source code:

  • Check the meta description (length, keyword usage, relevance).
  • Check the meta keywords (relevance, stuffing).
  • Look for anything unusual/spammy (keywords in noscript, H1s in JavaScript, etc.).
  • If the navigation uses JavaScript or drop-downs, make sure it's crawlable.
  • Sometimes cut and paste the code into Dreamweaver to get a better look at the code-to-page relationship.
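
To make steps A1 and A4 a little more concrete, here is a minimal Python sketch (my addition, not Matt's; www.domain.com is a placeholder) that requests the home page without following redirects and prints the title and meta tags:

import http.client
from html.parser import HTMLParser
class MetaAndTitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.metas = {}
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            name = (attrs.get("name") or "").lower()
            if name:
                self.metas[name] = attrs.get("content") or ""
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data
conn = http.client.HTTPConnection("www.domain.com")   # placeholder domain
conn.request("GET", "/")
resp = conn.getresponse()
print("Status:", resp.status)                          # 301/302 here means A1 found a redirect
print("Location:", resp.getheader("Location") or "(none)")
parser = MetaAndTitleParser()
parser.feed(resp.read().decode("utf-8", errors="replace"))
print("Title:", parser.title.strip() or "(none)")
print("Meta description:", parser.metas.get("description", "(none)"))
print("Meta keywords:", parser.metas.get("keywords", "(none)"))
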
B. Analyze robots.txt file. See what’s being blocked and what’s not. Make sure it’s written correctly.
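
If you prefer to script this step, Python's standard-library robots.txt parser can test a few representative URLs against the live file (a sketch; the domain and paths are placeholders):

import urllib.robotparser
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.domain.com/robots.txt")   # placeholder domain
rp.read()
for path in ("/", "/category/widgets/", "/products/blue-widget.html"):  # placeholder paths
    allowed = rp.can_fetch("*", "http://www.domain.com" + path)
    print(path, "-> crawlable" if allowed else "-> BLOCKED")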

C. Check for www and non-www domains — i.e., canonicalization issues. Only one should resolve; the other should redirect.
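
One way to verify this, as a quick sketch with a placeholder domain, is to request both hostnames and confirm that one answers 200 while the other answers with a 301 pointing at it:

import http.client
for host in ("domain.com", "www.domain.com"):     # placeholder domain
    conn = http.client.HTTPConnection(host)
    conn.request("GET", "/")
    resp = conn.getresponse()
    print(host, resp.status, resp.getheader("Location") or "")
# Expect one host to answer 200 and the other to answer 301 pointing at it.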

D. Look at the sitemap (if one exists). (A small parsing sketch follows this item.)
  1. Check keyword usage in anchor text.
  2. How many links?
  3. Are all important (category, sub-category, etc.) pages listed?
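
For an HTML sitemap, a small parser can count the links and dump the anchor text for review. This is a sketch with a placeholder sitemap URL, not part of the original checklist:

import urllib.request
from html.parser import HTMLParser
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []        # (href, anchor text) pairs
        self._open = None
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._open = [dict(attrs).get("href") or "", ""]
    def handle_data(self, data):
        if self._open is not None:
            self._open[1] += data
    def handle_endtag(self, tag):
        if tag == "a" and self._open is not None:
            self.links.append((self._open[0], self._open[1].strip()))
            self._open = None
html = urllib.request.urlopen("http://www.domain.com/sitemap.html").read()  # placeholder URL
collector = LinkCollector()
collector.feed(html.decode("utf-8", errors="replace"))
print(len(collector.links), "links found")
for href, text in collector.links:
    print(text, "->", href)    # eyeball the keyword usage in the anchor text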

E. Visit two category/1st-level pages.
Repeat A1, A2, A3, and A4 - this will be quicker since many elements (header, footer, menus) will be the same. In particular, look for unique page text, unique meta tags, and correct use of H1s and H2s to structure content.

Check for appropriate PageRank flow. Also look at how these pages link back to the home page. Is index.html or default.php appended to the link? It shouldn't be.

F. Visit two product/2nd-level pages.
Same steps as E.
Also, if the site sells common products, find 2-3 other sites selling the same exact items and compare product pages. Are all sites using the same product descriptions? Unique content is best.

G. Do a site:domain.com search in all 3 main engines.
Compare the pages indexed across the three. Is the number of pages indexed unusually high or low based on what you saw in the sitemap and site navigation? This may help identify crawlability issues. Is one engine showing substantially more or fewer pages than the others? Double-check the robots.txt file if needed.

H. Do site:domain.com *** -jdkhfdj search in Google to see supplemental pages.
All sites will have some pages in the supplemental index. Compare this number with the overall number of pages indexed. A very high percentage of pages in the supplemental index = not good.

I. Use Aaron’s SEO for Firefox extension to look at link counts in Yahoo and MSN. If not in a rush, do the actual link count searches manually on Yahoo Site Explorer and MSN to confirm.

-----------------------------------------------------------------------------------------------

In addition, I would like to pass along some nice SEO tips from Paul:
1) regularly check link:domain.com to see which links rank higher than others, so you can assess which sites are better for promotion purposes;
2) use the Page Strength tool at http://www.seomoz.org/page-strength;
3) have both the HTML sitemap mentioned above and an XML sitemap; and
4) get (in 1-2 minutes) the Google and Yahoo keycodes.

Keep this as your handy checklist for quickly SEOing your site... :)

What You Don't Know About Your Web Site Can Hurt You

Here is an interesting article written by Christine Churchill, President of KeyRelevance.com, a full-service search engine marketing firm. It will help you get to know your web site better.

----------------------------------------------------------------------------------------------

It's tough to be a small business in today's fast paced world. Small businesses not only have to know their core industry inside out, but now they have the additional burden of being proficient in online marketing. Since many small businesses have limited staff, most people within these companies wear multiple hats, from CEO to webmaster. Unfortunately, this often means the person responsible for the web site knows very little about it. Everything may seem to run flawlessly for a time, but then, when something goes wrong, they are left scrambling for help and at the mercy of their Internet service provider (ISP).

This is the first of two articles designed to raise the awareness of small business owners. Take heed: little details associated with your web site that are ignored can have a negative effect on your site and your business. Some problems affect the business mechanics; others affect your search positioning. The good news is, most are easy to fix once you're aware of them. Being informed of potential challenges provides you with the opportunity to prepare and avoid serious consequences down the road.

If you wear the webmaster hat, you'll want to carefully review the following list for some helpful information and tips. The more you know, the less time and money you'll waste later on.

1. Your domain name is about to expire, and you don't know it
Every domain name has at least three contacts associated with it: administrative, technical and registrant. When the domain name is about to expire, renewal notices are sent multiple times. Unfortunately, in many cases, the person whose email is on the account no longer works at your company. The notices are still sent, but since that email address is no longer valid, they go unread. What happens? Your domain name expires and your site "mysteriously" goes offline.

Your domain name is extremely important and worth protecting. Don't assume everything is fine. Do a WHOIS search and find out the details on your account. You may discover the information is wrong, out of date, or not what you expected. More than one company has been shocked to find they didn't actually own their domain name... and now they have to buy it.

If you are the rightful owner and your ownership lapses, you have a grace period of 30-60 days to renew (depending on the registrar). After that time, the domain name becomes available for anyone to purchase. Recapturing your domain name after it has been released and purchased can be an expensive process that often involves lawyers. The secondary domain market is a booming business. The players know the value of established domain names and fully intend to take advantage of them. Avoid this heartache by checking your company's domain name status today and then registering it for a long period.
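
If you would rather script the reminder than wait for a renewal notice, a raw WHOIS query over port 43 will show the expiry date. This is a minimal sketch: the registry server shown handles .com, and example.com stands in for your own domain:

import socket
def whois_lookup(domain, server="whois.verisign-grs.com"):
    # Send the bare domain name and read the registry's reply (WHOIS protocol, port 43).
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode())
        reply = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            reply += chunk
    return reply.decode(errors="replace")
for line in whois_lookup("example.com").splitlines():
    if "Expir" in line:   # e.g. "Registry Expiry Date: ..."
        print(line.strip())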

In an article entitled "How to Protect Your Domain Name," fellow Small Is Beautiful columnist Matt McGee tells a true story of his experience helping a small business owner who lost his domain name.

2. Your robots.txt file has banished search engines from your site
This is one of those invisible problems that can kill your site with regard to rankings. To make matters worse, it can go on for months without anyone knowing there is a problem. I don't want to sound like a doomsayer, but don't assume your company is immune to this problem. We've even seen it happen to widely known and publicly traded businesses with a dedicated staff of IT experts.

There are numerous ways to accidentally alter your robots.txt file. Most often it occurs after a site update when the IT department rolls up files from a staging server to a live server. In these instances, the robots.txt file from the staging server is accidentally included in the upload. (A staging server is a separate server where new or revised web pages are tested prior to uploading to the live server. This server is generally excluded from search engine indexing on purpose to avoid duplicate content issues.)

If your robots.txt excludes your site from being indexed, your site will drop from the engines' databases. You may think you did something wrong that got your site penalized or banned, but it's actually your robots.txt file telling the engines to go away.

How do you tell what's in your robots.txt file? The easiest way to view your robots.txt is to go to a browser and type your domain name followed by a slash then "robots.txt." It will look something like this in the address bar: http://www.mydomainname.com/robots.txt.

If you get a 404-error page, don't panic. The robots.txt file is actually an optional file. It is recommended by most engines but not required.

You have a problem if your robots.txt file says:

User-agent: *
Disallow: /

A robots.txt file that contains the above text is excluding ALL robots - including search engine robots - from indexing the ENTIRE site. If you have certain sections you don't want indexed by the engines (such as an advertising section or your log files), you can selectively disallow them. A robots.txt that disallows the ads and logs directories would be written like this:

User-agent: *
Disallow: /ads
Disallow: /logs

The disallow shown above only keeps the robots from indexing the directories listed. Some webmasters falsely think that disallowing a directory in the robots.txt file protects the area from prying eyes. The robots.txt file only tells robots what to do, not people (and the standard is voluntary, so only "polite" robots follow it). If certain files are confidential and you don't want them seen by other people or competitors, they should be password protected.

At SES New York 2007, Danny Sullivan hosted a robots.txt summit where search engine representatives talked about the frequent misuse of the file and how webmasters accidentally excluded their sites from indexing. To learn more about the robots.txt file see http://www.robotstxt.org/.

Here's something good to know: If you are using Google Webmaster Tools, Google will indicate which URLs are being restricted from indexing.

3. Your site is scaring your customers with expired SSL certificate notices
If you're a small business conducting ecommerce, you're probably familiar with Secure Sockets Layer (SSL) Certificates. These certificates enable encryption of sensitive information during online transactions. When the certificate is up to date the technology protects your web site and lets customers know they can trust you. Sadly, many times the person who originally set up the certificate moves on. Because their email no longer works, the renewal notices fall to the side. So you plod along unaware of the lurking danger. Sales plummet and no one can determine why.

Finally, someone notices the "scary security messages" that appear when someone starts the checkout process. If you're lucky, a customer will call and tell you about the problem. If you're smart, you'll have an employee periodically verify that your checkout process and SSL certificate are working properly.

To check your SSL certificate, visit a secure page on your site, then double-click the padlock icon in the bottom right corner of your browser. A window will pop up showing the SSL certificate details, including the expiration date. If the certificate is set to expire in less than 2-3 weeks, you should begin working with your IT department or ISP to get the certificate renewed.
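
If you would rather not depend on someone noticing the padlock, the expiry date is easy to pull programmatically. A minimal Python sketch, with a placeholder hostname you would replace with your own secure host:

import socket
import ssl
import time
host = "www.domain.com"  # placeholder: use your own secure host
context = ssl.create_default_context()
with socket.create_connection((host, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()
not_after = cert["notAfter"]  # e.g. "Jun  1 12:00:00 2026 GMT"
days_left = int((ssl.cert_time_to_seconds(not_after) - time.time()) // 86400)
print(f"Certificate for {host} expires {not_after} ({days_left} days left)")
if days_left < 21:
    print("Start the renewal with your IT department or ISP now.")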

4. Your content management system (CMS) is limiting your search engine success
Search engine optimizers have a love-hate relationship with CMS. A CMS can make adding content to a site easy for the non-programmer, but oftentimes the system is hostile toward search engines. A CMS that doesn't allow unique titles, META tags, breadcrumbs, unique alt attributes, and other on-page optimization techniques can limit a site's success. For more details, I highly recommend you read an article by my colleague, Stephan Spencer, on search-friendly content management systems.

5. When you changed domain names, your redirects were set up improperly
Google and other search engines will treat various types of redirects differently. To ensure that the current domain inherits all the link equity the old domain has earned, verify that your site utilizes "301 permanent" redirects rather than "302 temporary" redirects. These numbers are codes that your web server sends to browsers and search engine spiders telling them how to handle the web page. If your server tells the search engine spiders that the new location is only temporary, the search engines will ignore the redirection and not transfer the existing link equity to the new site.

To properly implement this, you need to ensure that every page of the old site is redirected to the corresponding new page. If the domain name changed but the site architecture did not, then simply redirecting the old domain to the new one is sufficient. If the page URLs changed as part of a larger redesign, ensure that every page on the old site is properly (301) redirected to a page on the new domain.
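
A quick way to confirm which code your server actually sends is to request an old URL without following the redirect and look at the status line. A sketch, with placeholder old/new domains and page names:

import http.client
conn = http.client.HTTPConnection("www.olddomain.com")   # placeholder old domain
conn.request("GET", "/some-old-page.html")               # placeholder old URL
resp = conn.getresponse()
print(resp.status, resp.reason)                           # want: 301 Moved Permanently, not 302
print("Redirects to:", resp.getheader("Location"))        # want: the matching page on the new domain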

Lisa Barone over on Bruce Clay's blog wrote an excellent "non-scary" article on how to set up a 301 redirect that is easy for even the non-techie to follow. And Aaron Wall at SEObook has a detailed 301 case study on how well the different engines recognized and followed 301s on his site.

6. Your site is sharing an IP address with a spamming site
Many small businesses choose to use a virtual or shared hosting service rather than purchasing their own server. This arrangement is usually less expensive than dedicated hosting and meets the needs of the small business. In many cases a virtual hosting arrangement is fine, but keep in mind that the search engines pay attention to who your neighbors are on that shared server.

Some sites have the unfortunate luck of being placed on a server with sites using known spam techniques. Since your site shares an IP address with the spammy guys, it may be unjustly penalized for the actions of those other sites. In other words, you might be a victim of guilt by association.

Even if you are running your own dedicated server, there is a small chance you'll face a similar issue. Dedicated servers are grouped into something called "Class C IP blocks." Basically, all the IP addresses in a block are the same except for the last number. Frequently, all of the addresses in a block belong to the same hosting company, so, in essence, while your site might be legit, there may be 253 other servers in the block besmirching your site's good name.

If you are concerned about being in a bad hosting environment, ask your ISP for the names of the other sites being hosted on your IP address (in the case of shared hosting, more than one site may be served from the same IP address). Also ask them for the domain names of the other sites that differ only in the last number of their IP address.
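
Finding that IP address (and the block it sits in) takes a few lines of standard-library Python; the domain below is a placeholder:

import socket
ip = socket.gethostbyname("www.domain.com")   # placeholder domain
block = ".".join(ip.split(".")[:3]) + ".0/24"
print("IP address:", ip)
print("Shared ('Class C') block to ask your ISP about:", block)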

7. You've got the overloaded server blues
Does your site take forever to load? If your page file size is reasonable and you have a fast browser connection, the problem may not be with your site, but with the server at the hosting company.

The hosting company may have too many sites hosted on one server. They may also have you on a server with a site that is extremely active and monopolizes the server resources. The overload can result in your server timing out when a request is made from a spider. If this condition is chronic, it could result in the engine thinking your site is down. That could result in your site being dropped from the index.

Another problem with a slow loading site is that it can cost you business. Most web surfers are impatient. If they don't see your site loading within a few seconds, they leave. They don't care what the cause of the slow loading is; they simply move on.
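
If you want a number rather than a feeling, timing a request yourself is a reasonable first pass. A sketch with a placeholder URL; note that the figure includes network latency from wherever you run it:

import time
import urllib.request
url = "http://www.domain.com/"  # placeholder
start = time.time()
body = urllib.request.urlopen(url, timeout=30).read()
elapsed = time.time() - start
print(f"Fetched {len(body)} bytes in {elapsed:.2f} seconds")
# A consistently slow answer for a small page points at the server, not your HTML.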

If your ISP provides a service level agreement (SLA) regarding performance, uptime, etc. you are likely OK. Any provider that offers such a guarantee will have implemented procedures that would make triggering those SLA thresholds unlikely. If your site is consistently sluggish, however, request that your site be moved to a new server. Note that this will cause some hiccups because your IP address will change, so make sure this is the actual source of the sluggish performance before requesting the switch. Consider moving to a higher class of service or a dedicated server. If your web site is a core part of your business, pay the marginal costs needed to improve the service.

8. Your site is broken on Firefox
During the "browser wars" of the late 1990s, it was important to check your site under multiple browsers (including browsers for Macs and Unix) because many times a site would "break" or render oddly under different browsers. As Internet Explorer (IE) achieved dominance, many IE-centric web designers thought of browser compatibility as an issue of the past because IE was very forgiving: it would properly display even sloppily coded sites.

With the enthusiastic spread of the Firefox browser, the compatibility issue has reared its head again. Firefox is a more W3C-standards compliant browser so sites that look great under IE sometimes break under Firefox. Pages that use proprietary tags that only work under one browser (usually IE) or pages that contain syntax errors (especially unclosed tags or strange nesting) can cause a web page to render poorly in Firefox as well as Opera or other standards-compliant browsers.

In June 2007 a OneStat study on browser use reported that Firefox commanded a 19.65% share of the US browser market and 12.72% globally. If you're in a high-tech industry, your percentage of visitors using Firefox may be even higher.

In parts of Europe, adoption of the Firefox browser is even higher, especially in Germany where the share is over 26% (that's better than 1 in 4 visitors!). The browser wars get even more interesting when you consider that the most widespread browser in China is Maxthon, a browser of which most Westerners have never heard.

What this means to the small business webmaster is that you can't ignore browser compatibility any more, or you may be giving 20% of your visitors a bad experience.

----------------------------------------------------------------------------------------------