Advanced Meta-Tags Generator
Title: The Title Tag must contain no more than 70 characters (generally, up to 100 characters may be indexed).
Author: The Author Tag is for the person who wrote the material for the site.
Subject: The Subject Tag is for what your site is about: business, music, hobby, cars. Use up to 100 characters.
Description: The Description Tag can have up to 150 characters (generally, 200 to 250 characters may be indexed, though only a smaller portion of this amount may be displayed).
Classification: The Classification Tag is similar to Description but more detailed.
Keywords: In the Keywords Tag, use everything you think someone will search for to find your site. Use up to 200 characters.
Geography: Where are you located? Full address.
Language: Is your site in English, Spanish, French, etc.?
Expires: Use "never" unless your site will expire (e.g. Tue, 18 Apr 2006 14:57:09 GMT; note: requires an RFC 1123 date as shown).
Cache Control: Cache control level (None, Public, Private, no-Cache, no-Store).
No Cache: This directive indicates that cached information should not be used and that requests should instead be forwarded to the origin server (Yes/No).
Copyright: Who is the owner of the site (company name).
Zip Code: Your zip code.
City: Your city or town.
Country: Your country; use all names: USA, United States, United States Of America, America, etc.
Designer: Webmaster name.
Publisher: Owner, webmaster, or company name.
Revisit-After: Tells search engines how often this page updates (e.g. 21 days; note: most search engines do not support this meta tag).
Distribution: Use Global unless it is a Local-only site (Global, Local, Internal Use).
Robots: The values ALL and NONE set all directives on or off: ALL = INDEX,FOLLOW and NONE = NOINDEX,NOFOLLOW.
MS Tags: Do you want Microsoft products to automatically generate smart tags on your web pages? (Yes/No)
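The tool's output is just a block of standard meta elements for the page head. As a rough illustration only (not the SEO Chat tool itself), here is a minimal Python sketch that assembles a few of the fields listed above into meta tags; the length limits follow the list above, while the function name, defaults, and field selection are my own assumptions.

```python
# A minimal sketch of the kind of markup a meta-tags generator emits.
# The tag names and length limits follow the field list above; the
# function name and defaults are illustrative assumptions.
from html import escape

LIMITS = {"title": 70, "description": 150, "keywords": 200}

def meta_tags(title, description, keywords, author=None, robots="ALL"):
    """Return a <head> snippet containing the basic meta tags described above."""
    def clip(field, value):
        limit = LIMITS.get(field)
        return value[:limit] if limit else value

    lines = [
        f"<title>{escape(clip('title', title))}</title>",
        f"<meta name=\"description\" content=\"{escape(clip('description', description))}\">",
        f"<meta name=\"keywords\" content=\"{escape(clip('keywords', keywords))}\">",
        f"<meta name=\"robots\" content=\"{robots}\">",
    ]
    if author:
        lines.append(f"<meta name=\"author\" content=\"{escape(author)}\">")
    return "\n".join(lines)

print(meta_tags("My SEO Blog", "Tips and tools for search engine optimization.",
                "seo, meta tags, duplicate content", author="Webmaster"))
```

The escape call is there so that quotes or angle brackets in the field values don't break the generated markup.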
Site Link Analyzer Tool
URL: enter a valid URL to analyze.
Type of links to return: External (links going to outside websites), Internal (links inside the current website), or both.
Additional info: optionally show nofollow links.
How to Protect Your Content
Your content is your most precious asset. It takes effort and time not only to create it but also to protect it from thieves. Unfortunately, content theft is all too common, and there is hardly a site that hasn't been affected.
On the other hand, as my experience shows, when there are thousands of articles to look after, protection takes so much time that in practice it only makes sense for your most important articles. Still, you can't let thieves walk away with your content; you must know what to do when you encounter theft. Here are the steps to protect your content.
Why Your Content Is Your Most Important Asset
Unless you are new to SEO, the answer to this question is obvious. As we say, "Content Is King". You need original content in order to rank well in Google, and this is why you churn out articles, images, videos, and so on to publish on your site. However, it is so easy to copy and paste content, and this is why content theft is so common.
As if it weren't bad enough that you feed somebody else's site for free, the duplicate content penalty adds insult to injury. Google tries really hard to deal with duplicates, but it is all too common to see stolen articles rank higher than your original. This is why it is so important to protect your content in any way you can.
Place Copyright Notices and Watermarks
As naïve as it might sound, sometimes thieves aren't aware they are stealing. There are many articles, images, videos, and other materials in the public domain that are free to use, even commercially. To avoid any confusion about whether your content is in the public domain, be sure to place a copyright notice in the footer of your site, or even better, under the copyrighted piece of content itself.
It also makes sense to add physical barriers to theft. For instance, you can add watermarks to images and videos. These aren't 100% secure, but they will stop some thieves, because with your watermark on it the material is awkward to reuse elsewhere.
For articles, you may want to disable text selection. This makes it harder to copy content directly and will stop many thieves, because copying your content now involves more effort. Unfortunately, there are other ways to copy your content (though they do require more effort), so if somebody really wants it, disabled copying won't stop them, but it's better than nothing.
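The article doesn't prescribe a tool for watermarking, so as one possible approach to the watermark idea above, here is a hedged sketch using the Pillow imaging library; the file names, watermark text, and placement are assumptions for illustration, not a recommendation from the original author.

```python
# Minimal watermarking sketch using Pillow (pip install Pillow).
# File names, text, and placement are illustrative assumptions.
from PIL import Image, ImageDraw, ImageFont

def add_watermark(src_path, dst_path, text="(c) example.com"):
    base = Image.open(src_path).convert("RGBA")
    overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    font = ImageFont.load_default()  # swap in a .ttf via ImageFont.truetype for larger text
    # Semi-transparent white text near the bottom-right corner.
    draw.text((base.width - 160, base.height - 30), text,
              fill=(255, 255, 255, 128), font=font)
    Image.alpha_composite(base, overlay).convert("RGB").save(dst_path)

add_watermark("photo.jpg", "photo_watermarked.jpg")
```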
Use Google Authorship to Guard Your Content
Google Authorship is a very useful tool when it comes to content protection and building your online reputation. Basically, the idea is simple: you link your content to your Google profile and claim authorship of it.
The only issue is that you must use your real name. This is a problem if you write under a pseudonym, ghostwrite, or simply don't want to disclose your authorship because of privacy concerns. If your site has multiple writers, you can still use Google Authorship, but each of them must claim his or her articles separately.
Once your content is registered with Google Authorship, Google knows it's you who created it, so even if it gets copied somewhere else, you won't get the duplicate content penalty.
Set Google Alerts to Watch for Copied Content
Protecting your content from theft is one thing; catching thieves is another. Even if you do a good job of guarding your content, there will always be thieves. The easiest way to catch them is with the help of Google Alerts.
Google Alerts is another useful service from Google. Without going into too much detail, the logic is this: you copy sentences from your text and create alerts to be notified when they appear online. Make them an exact match (i.e. put them in quotation marks), so that when your words are discovered somewhere else, you get an alert. It's best to create two or three alerts per article: one for the first paragraph and a few more from random places in the text, as in the sketch below.
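Google Alerts itself is configured through its web interface, but choosing which sentences to quote can be scripted. The helper below is my own illustration (not part of any Google API), and its naive sentence splitting is only meant to show the selection step.

```python
# Sketch: pick quoted, exact-match phrases from an article for Google Alerts.
# The sentence splitting is deliberately naive; this is only an illustration.
import random
import re

def alert_queries(article_text, extra=2):
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", article_text)
                 if len(s.split()) >= 6]
    if not sentences:
        return []
    picks = [sentences[0]]  # the article's opening sentence
    picks += random.sample(sentences[1:], min(extra, len(sentences) - 1))
    return [f'"{s}"' for s in picks]  # quotation marks force an exact match

article = open("my-article.txt", encoding="utf-8").read()
for query in alert_queries(article):
    print(query)  # paste each into https://www.google.com/alerts
```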
Your first paragraph might be copied more often, for instance as an intro to your article followed by a link to your site. This isn't theft, but you still might want to be aware of it. Also, if your content is republished and you are credited as the author, this technically isn't theft either, though you certainly might not like it.
Steps to Take to Deal with the Theft
After you get an alert and discover that content of yours has been
stolen, here is what you can do next.
Prepare Your Evidence
The first step is to gather your evidence. This means making screenshots and preparing the original files. Of course, it's hard to prove you were the first to publish a particular piece, because having the drafts for an article doesn't mean much; they could have been created afterwards in an attempt to frame the original author. For images and videos, having the source files is stronger proof.
If your content is indexed in Google and it has a date (and, of course, this date is prior to the date the copy was indexed), you can use this as evidence that the content was stolen from you, not the other way around.
Contact the Thief (and Their Host, If Necessary)
After you have your evidence, it's time to take real steps. You might be tempted to, but don't come out swinging right away. First, send a friendly email to the infringing party. Even if the probability isn't high, it's possible the theft wasn't deliberate. It's quite possible that after your friendly email the blog owner removes your content and the problem is solved.
If the friendly email to the blog owner doesn't help, contact their hosting provider. Attach the evidence you have; if the infringement is blatant, the hosting provider might even close the account if the owner won't remove the stolen content on their own.
File a DMCA Complaint
Very often the steps in the previous sections are enough to deal with thieves, but if they don't do the job for you, you will have to bring out the heavy artillery: a DMCA complaint with Google.
You submit a DMCA (Digital Millennium Copyright Act) complaint with Google to ask them to deindex content stolen from you. In this case, 'stolen' means used without your permission or without crediting you. Google is usually quick to remove stolen content, so you can expect that shortly after you submit the complaint, it will be removed from Google's index.
Dealing with content theft is very time-consuming, but if you want to protect your rights (and your SEO rankings), you need to do it. It's a never-ending battle, but with the right tools, as described in this article, your chances of success are good.
Source: http://www.webconfs.com/how-to-protect-your-content-article-54.php
Labels:
Copying,
Digital Millennium Copyright Act,
Duplicate Content,
Google,
Google Alert,
Google Authorship,
Search engine optimization,
Theft
Similar Page Checker
Enter First URL
Enter Second URL
How it Works
Search engines are known to act against websites that contain duplicate or similar content.
Your content could be similar to other websites on the Internet, or pages within your own website could be similar to each other (usually the case with dynamic product catalog pages).
This tool lets you determine the percentage of similarity between two pages.
The exact similarity percentage beyond which a search engine may penalize you is not known, and it varies from search engine to search engine. Your aim should be to keep your page similarity as low as possible.
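The exact algorithm behind the Similar Page Checker isn't published, but for a rough idea of what a similarity percentage looks like, here is a sketch that compares the visible text of two URLs with Python's difflib; the URLs are placeholders and the tag stripping is deliberately simple.

```python
# Rough page-similarity sketch; not the Similar Page Checker's actual algorithm.
import re
from difflib import SequenceMatcher
from urllib.request import urlopen

def visible_text(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)  # drop scripts and styles
    text = re.sub(r"(?s)<[^>]+>", " ", html)                   # strip remaining tags
    return " ".join(text.split()).lower()

def similarity_percent(url_a, url_b):
    ratio = SequenceMatcher(None, visible_text(url_a), visible_text(url_b)).ratio()
    return round(ratio * 100, 1)

print(similarity_percent("http://example.com/page-1", "http://example.com/page-2"), "% similar")
```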
Duplicate Content Filter: What It Is and How It Works
Duplicate content has become a huge topic of discussion lately, thanks to the new filters that search engines have implemented. This article will help you understand why you might be caught in the filter and how to avoid it. We'll also show you how to determine whether your pages have duplicate content, and what to do to fix it.
Search engine spam is any deceitful attempt to deliberately trick the search engine into returning inappropriate, redundant, or poor-quality search results. Many times this behavior is seen in pages that are exact replicas of other pages, created to get better results in the search engine. Many people assume that creating multiple or similar copies of the same page will either increase their chances of getting listed in search engines or help them get multiple listings, due to the presence of more keywords.
In order to make search results more relevant to users, search engines use a filter that removes duplicate content pages from the search results, and the spam along with it. Unfortunately, good, hardworking webmasters have also fallen prey to the filters that search engines impose to remove duplicate content. These webmasters unknowingly spam the search engines, even though there are things they can do to avoid being filtered out. In order to truly understand the concepts you can implement to avoid the duplicate content filter, you need to know how this filter works.
First, we must understand that the term "duplicate content penalty" is actually a misnomer. When we refer to penalties in search engine rankings, we are actually talking about points that are deducted from a page in order to arrive at an overall relevancy score. In reality, duplicate content pages are not penalized. Rather, they are simply filtered, the way you would use a sieve to remove unwanted particles. Sometimes, "good particles" are accidentally filtered out.
Knowing the difference between the filter and the penalty, you can now
understand how a search engine determines what duplicate content is.
There are basically four types of duplicate content that are filtered
out:
Websites with Identical Pages - Identical pages within a site are considered duplicate content, and websites that are identical to another website on the Internet are also considered spam. Affiliate sites with the same look and feel that contain identical content, for example, are especially vulnerable to a duplicate content filter. Another example would be a website with doorway pages. Many times, these doorways are skewed versions of landing pages, but those landing pages are identical to other landing pages. Generally, doorway pages are intended to spam the search engines in order to manipulate search engine results.
Scraped Content - Scraped content is content taken from a website and repackaged to make it look different, but in essence it is nothing more than a duplicate page. With the popularity of blogs on the Internet and the syndication of those blogs, scraping is becoming more of a problem for search engines.
E-Commerce Product Descriptions - Many eCommerce sites out there
use the manufacturer's descriptions for the products, which hundreds or
thousands of other eCommerce stores in the same competitive markets are
using too. This duplicate content, while harder to spot, is still
considered spam.
Distribution of Articles - If you publish an article and it gets copied and put all over the Internet, this is good, right? Not necessarily for all the sites that feature the same article. This type of duplicate content can be tricky, because even though Yahoo and MSN determine the source of the original article and deem it most relevant in search results, other search engines like Google may not, according to some experts.
So, how does a search engine's duplicate content filter work? Essentially, when a search engine robot crawls a website, it reads the pages and stores the information in its database. Then it compares its findings to the other information in its database. Depending on a few factors, such as the overall relevancy score of a website, it determines which pages are duplicate content and filters out the pages or websites that qualify as spam. Unfortunately, if your pages are not spam but contain enough similar content, they may still be regarded as spam.
There are several things you can do to avoid the duplicate content
filter. First, you must be able to check your pages for duplicate
content. Using our Similar Page Checker, you will be able to determine the similarity between two pages and make them as unique as possible. By entering the URLs of two pages, this tool compares those pages and points out how they are similar so that you can make them unique.
Since you need to know which sites might have copied your site or pages, you will need some help. We recommend using a tool that searches the Internet for copies of your page: www.copyscape.com. There, you can enter your web page's URL to find replicas of it. This can help you create unique content, or even address the issue of someone "borrowing" your content without your permission.
Let's look at the issue of some search engines possibly not considering the source of the original content from distributed articles. Remember, some search engines, like Google, use link popularity to determine the most relevant results. Continue to build your link popularity while using tools like www.copyscape.com to find out how many other sites have the same article; if allowed by the author, you may be able to alter the article so as to make the content unique.
If you use distributed articles for your content, consider how relevant the article is to your overall web page and then to the site as a whole. Sometimes, simply adding your own commentary to the articles can be enough to avoid the duplicate content filter; the Similar Page Checker can help you make your content unique. Furthermore, the more relevant articles you can add to complement the first article, the better. Search engines look at the entire web page and its relationship to the whole site, so as long as you aren't exactly copying someone's pages, you should be fine.
If you have an eCommerce site, you should write original descriptions
for your products. This can be hard to do if you have many products,
but it really is necessary if you wish to avoid the duplicate content
filter. Here's another example of why using the Similar Page Checker is a great idea: it can show you where your descriptions need to change so that your site has unique and original content. This also works well for scraped content. Many scraped-content sites offer news; with the Similar Page Checker, you can easily determine where the news content is similar and then change it to make it unique.
Do not rely on an affiliate site that is identical to other sites, and do not create identical doorway pages. These types of behavior are not only filtered out immediately as spam, but there is generally no comparison of the page to the site as a whole if another site or page is found to be a duplicate, and that can get your entire site in trouble.
The duplicate content filter is sometimes hard on sites that don't
intend to spam the search engines. But it is ultimately up to you to
help the search engines determine that your site is as unique as
possible. By using the tools in this article to eliminate as much
duplicate content as you can, you'll help keep your site original and
fresh.
Labels:
Duplicate Content,
Google,
Search,
Search engine optimization,
Uniform resource locator,
Web crawler,
Web Design and Development,
Web search engine
Search Engine Spider Simulator
Enter URL to Spider
How it Works
A lot of the content and links displayed on a webpage may not actually be visible to search engines, e.g. Flash-based content, content generated through JavaScript, and content displayed as images.
This tool simulates a search engine by displaying the contents of a webpage exactly as a search engine would see it.
It also displays the hyperlinks that a search engine will follow (crawl) when it visits that webpage.
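To make the idea concrete, here is a minimal "spider view" sketch built on Python's standard-library HTMLParser. It is an approximation of what the tool does, not its actual code: it keeps only the text and crawlable href links, and ignores scripts, styles, and image content; the URL is a placeholder.

```python
# Minimal "spider view" sketch: visible text and crawlable links only.
# Scripts, styles, and image content are ignored, much like a basic crawler.
from html.parser import HTMLParser
from urllib.request import urlopen

class SpiderView(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text, self.links, self._skip = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href and not href.lower().startswith("javascript:"):
                self.links.append(href)  # fake JavaScript "links" are not crawlable

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

page = urlopen("http://example.com/").read().decode("utf-8", errors="replace")
spider = SpiderView()
spider.feed(page)
print(" ".join(spider.text))  # what the spider "reads" on the page
print(spider.links)           # the hyperlinks it would follow
```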
See Your Site With the Eyes of a Spider
This article explains how search engines view a webpage.
Making efforts to optimize a site is great, but what counts is how search engines see those efforts. While even the most careful optimization does not guarantee a top position in search results, if your site does not follow basic search engine optimization principles, it is almost certain that it will not score well with search engines. One way to check in advance how your SEO efforts are seen by search engines is to use a search engine spider simulator.
Spiders Explained
Basically, all search engine spiders work on the same principle: they crawl the Web and index pages, which are stored in a database; various algorithms are later used to determine the ranking, relevancy, and so on of the collected pages. While the algorithms for calculating ranking and relevancy differ widely among search engines, the way they index sites is more or less uniform, and it is very important that you know what spiders are interested in and what they neglect.
Search engine spiders are robots, and they do not read your pages the way a human does. Instead, they tend to see only particular things and are blind to many extras (Flash, JavaScript) that are intended for humans. Since spiders determine whether humans will find your site, it is worth considering what spiders like and what they don't.
Flash, JavaScript, Image Text or Frames?!
Flash, JavaScript, and image text are NOT visible to search engines. Frames are a real disaster in terms of SEO ranking. All of them might be great in terms of design and usability, but for search engines they are absolutely wrong. An incredible mistake one can make is to have a Flash intro page (frames or no frames, this will hardly make the situation worse) with the keywords buried in the animation. Run a page with Flash and images (and preferably no text or inbound or outbound hyperlinks) through the Search Engine Spider Simulator tool and you will see that to search engines this page appears almost blank.
Running your site through this simulator will show you more than the fact that Flash and JavaScript are not SEO favorites. In a way, spiders are like text browsers: they don't see anything that is not a piece of text. So having an image with text in it means nothing to a spider, and it will ignore it. A workaround (recommended as an SEO best practice) is to include a meaningful description of the image in the ALT attribute of the img tag, but be careful not to use too many keywords in it, because you risk penalties for keyword stuffing. The ALT attribute is especially essential when you use images rather than text for links. You can use ALT text to describe what a Flash movie is about, but again, be careful not to cross the line between optimization and over-optimization.
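Since a spider only "sees" an image through its ALT text, a quick audit of which images are missing it can be useful. The sketch below is my own helper, assuming you already have the page's HTML saved locally; it simply lists img tags with no ALT text.

```python
# Sketch: list <img> tags whose ALT text is missing or empty.
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not (attrs.get("alt") or "").strip():
                self.missing_alt.append(attrs.get("src", "(no src)"))

audit = AltAudit()
audit.feed(open("page.html", encoding="utf-8").read())
for src in audit.missing_alt:
    print("Missing ALT text:", src)
```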
Are Your Hyperlinks Spiderable?
The search engine spider simulator can be of great help when you are trying to figure out whether your hyperlinks lead to the right place. For instance, link exchange websites often put fake links to your site with JavaScript (using mouseover events and similar tricks to make the link look genuine), but this is not a link that search engines will see and follow. Since the spider simulator does not display such links, you will know that something is wrong with the link.
It is highly recommended to try the Spider Simulator on your site.