
Optimization and Positioning (SEO)


Positioning and optimization (SEO - Search Engine Optimization) - all measures taken to improve the position of a website in search engine results.



Purpose and application




The purpose of positioning a website is to give it properties and parameters
that facilitate indexing by search engine robots and that meet the requirements
of the search algorithm as closely as possible, so that the page is ranked
higher when the relevant queries are made.

Optimizing the content and code is the first and fundamental step in
positioning a page, so its purpose is the same as that of positioning itself:
bringing the site to pre-defined positions in the organic search results
(Top 1, Top 3, Top 5, Top 10, etc.).




Types of positioning


Positioning and optimization of a site is preceded by a mandatory audit,
during which recommendations are developed for optimizing the content and code
and for how they should be implemented.
The tasks identified by the audit can be carried out through on-site and off-site activities.



On-site activities:


  • Distributing key phrases across the site's pages according to their subject matter.


  • Rewriting old content or writing new content while maintaining the required
    parameters (uniqueness and volume of text, density and placement of
    keywords, links, text structure, distribution of headings and subheadings,
    etc.).


  • Writing meta tags: title (for each page, in line with the key phrases it is
    positioned for) and description (a short summary of the content).

  • Correcting the source code and fixing its errors.


  • Optimizing the site for page loading speed, correct rendering in different
    browsers, a visual style appropriate to the subject of the page, font
    selection, alt attributes assigned to the images placed on the site, etc.

  • Optimizing usability: page navigation should be as comfortable as possible for the user, including the organization of the overall structure and the navigation rules for the entire site.

  • Linking the site's internal pages to create a structure of cross-links (see the short HTML sketch after this list).
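A minimal, hypothetical HTML fragment illustrating two of the on-site points above - descriptive alt attributes for images and internal cross-links between pages (the file names and paths are invented for illustration):

    <!-- hypothetical product page: descriptive alt text plus internal cross-links -->
    <img src="/img/samsung-galaxy.jpg" alt="Samsung Galaxy touch phone, front view">
    <p>
      See also our <a href="/catalogue/phones/touch/">touch phones</a>
      and the <a href="/catalogue/phones/accessories/">accessories</a> section.
    </p>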



Off-site activities:


  • Adding the site's URL to search engines and directories and, if necessary, assigning it to a region.

  • Obtaining external inbound links (backlinks) placed on other sites with related subject matter, and correcting existing incoming links: removing unnecessary links from low-quality sites and removing broken links.


  • Placing entries, articles, posts, reviews, etc. containing a link on other
    sites in order to increase the site's visibility.

  • Eliminating potential causes that may lead to sanctions being imposed by the search engines.



Classification of positioning methods by color

All the actions taken by SEO specialists to raise a page's position in search engine rankings can be divided into three groups:




  • White hat positioning (White Hat SEO)
    This covers efforts to increase the site's popularity among Internet users
    by providing them with interesting, original and unique content with a
    moderate density of keywords, and by increasing the number of naturally
    acquired links (for example, by adding the site to thematic directories).

  • Gray hat positioning (Grey Hat SEO)
    Grey hat SEO is the creation of content with a high density of keywords,
    doorway pages, and so-called backlink-farm sites ("zaplecze"), which are
    then linked to the pages being positioned.

  • Black hat positioning (Black Hat SEO) allows the extensive use of doorway pages and of content visible only to search engine spiders, so-called cloaking.
    Sites caught using such methods are banned by the search engines
    (removed from the search engine's index).


Optimizing text content


Optimization of text content means adapting the text to the requirements of the search engines in order to improve the page's position in search results.
This is an extremely important part of the positioning process, because it is
on the basis of a page's content that its relevance to a query is determined.
Text optimization is a complex set of activities consisting of several steps:






  • Introducing keywords into the text - each text is adapted to specific
    queries in response to which the page should appear in the search results.
    Keywords must, however, be used sparingly, otherwise search engines may
    penalize the page (a minimal keyword-density sketch follows this list).


  • Checking the uniqueness of the text - once the text is written, a
    plagiarism-checking service should be used to verify its uniqueness, and if
    the text is not sufficiently unique, it should be reworked from this angle.

  • Readability of the text - after the above steps, check that the resulting text is easy to read. If readability is poor, ask what the text is for.
    If it was created specifically for robots and probably no one will read it,
    it can be left unchanged; but if the text contains important information
    that must reach the site's visitors, it should be reworded to be as
    readable as possible, even at the cost of slower positioning effects.

    If the site owner lacks the time to create the text, it can be ordered from
    copywriters or purchased on a content marketplace.
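As a rough illustration of the keyword-density parameter mentioned above, here is a minimal Python sketch; the phrase, the sample text and the way density is counted are illustrative assumptions, not a rule used by any search engine:

    import re

    def keyword_density(text: str, phrase: str) -> float:
        """Share of the words in `text` taken up by occurrences of `phrase`."""
        words = re.findall(r"\w+", text.lower())
        if not words:
            return 0.0
        occurrences = text.lower().count(phrase.lower())
        return occurrences * len(phrase.split()) / len(words)

    sample = "Cheap touch phones. Our touch phones ship fast."
    print(round(keyword_density(sample, "touch phones") * 100, 1))  # density in percent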


PageRank (PR)


PageRank (PR) - a weighting used by the Google search engine, commonly expressed as an integer between 1 and 10, calculated on the basis of the quantity and quality of the links, both external and internal, that point to a page.

PR can be calculated with relatively high accuracy from the formula derived from the algorithms presented in the article by Google's creators, Sergey Brin and Larry Page:

PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

(let us call this formula No. 1).




Before explaining the symbols in the formula, it must be noted that many of
the values and parameters Google uses to determine a page's PR are a
proprietary business secret.

The explanations below are therefore only putative definitions developed
experimentally by the SEO community.



Because of the huge number of pages on the Internet, PR expressed in absolute
terms is not a convenient tool for assessing the importance of a page (such an
assessment is needed, for example, when deciding where to place links).
In such cases it is much more convenient to use the Google Toolbar PageRank (TLPR), a browser plugin that shows the importance of a page as a whole number between 1 and 10. TLPR is calculated according to the formula:

TLPR * a = log_base(PR)


The exact value of the logarithm base depends on the number of pages on the
Internet, and the formula for calculating it is Google's secret.
Nevertheless, observations suggest that it is a value close to 7. Similarly, the coefficient a, which lies between 0 and 1, is taken to be 1. On this basis, with fairly high accuracy, the Google Toolbar PR can be calculated as:

TLPR ≈ log_7(PR)

Keep in mind, however, that Google's ranking process uses the real PR; TLPR exists solely for the convenience of SEO specialists.
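A minimal Python sketch of the putative conversion described above; the base of 7 and the coefficient a = 1 are the community guesses quoted in this text, not official Google values:

    import math

    def toolbar_pr(real_pr: float, base: float = 7.0, a: float = 1.0) -> int:
        """Approximate Toolbar PageRank, assuming TLPR = a * log_base(PR)."""
        if real_pr <= 1.0:
            return 0
        return min(10, round(a * math.log(real_pr, base)))

    print(toolbar_pr(1275.15))  # about 4, matching the internal-linking example below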

The formula for calculating PageRank

The symbols used in formula No. 1 denote the following:





  • d - the so-called damping factor, reflecting how much of a linking page's
    importance is passed on through its links to the page whose PR is being
    calculated. Its exact value is not disclosed, but observations suggest
    that with high accuracy it can be taken to be 0.85 (i.e. 85% of the page's
    importance is passed on).

    According to other interpretations, the damping factor is the probability
    that a visitor follows one of the links placed on the page; in this case
    too, d is taken to be 0.85.

  • n - the number of pages containing links to the page whose PR is being calculated.

  • C(Ti) - the total number of outgoing links on the linking page Ti.

  • T1 ... Tn - the pages that link to the page whose PR is being calculated.



Increasing PR through internal linking

Starting from the page ranking formula (formula No. 1), we can conclude that the minimum PR of a page can be neither negative nor equal to 0. If we assume that d = 0.85, then 1 - d = 0.15, which leads to the conclusion that PRmin = 0.15 (when the sum in the brackets of formula No. 1 is 0).
In this way, even a new site with a fairly large number of pages and no
external links can achieve a very high PR thanks to skillful internal linking
(each page containing a link to one specific page).


Example:


with approximately 15,000 pages, each carrying 10 internal links that lead to
the home page, you get:


PR(homepage) = 0.15 + 0.85 * 15000/10 = 1275.15
TLPR(homepage) = log_7(1275.15) ≈ 3.67, i.e. about 4
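A minimal Python sketch of formula No. 1 and of the home page example above; the damping factor d = 0.85 and the assumed PR of 1 for each internal page are the putative figures used in this text:

    import math

    D = 0.85  # putative damping factor

    def pagerank(incoming, d=D):
        """Formula No. 1: PR(A) = (1 - d) + d * sum(PR(Ti) / C(Ti))."""
        return (1 - d) + d * sum(pr_t / c_t for pr_t, c_t in incoming)

    # 15 000 internal pages, each assumed to have PR = 1 and 10 outgoing links
    links = [(1.0, 10)] * 15_000
    pr_home = pagerank(links)
    print(pr_home)               # about 1275.15
    print(math.log(pr_home, 7))  # about 3.67, i.e. a Toolbar PR of roughly 4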


The importance of PR

In the early days of the Google search engine, PR was the main factor influencing the ranking of sites in search results.
Currently there are approximately 200 such factors; however, to this day PR remains one of the leading parameters affecting the results.

Subdomain


A subdomain is part of a higher-level domain. For example, if your domain name is domena.pl, a subdomain name will look like this: poddomena.domena.pl. Most hosting companies provide subdomain registration free of charge. A subdomain can also be hosted elsewhere in order to reduce the load on the main site. Subdomains are primarily used to separate sections of a website; for example, news sites often move their forums to a subdomain so as not to mix news with discussion. Search engines likewise place their additional services, such as mail, on subdomains.

Position in search results


The position in search results is the place a particular site occupies in the results returned for a given query (key phrase). This position depends on many factors, including:





  • The place where the user is located (geolocation).
    To provide the most relevant results, Google automatically detects the
    location from which the query was made and adjusts the results accordingly.

  • Google Dance. Google's infrastructure determines which of its data centers a query is served from, and information obtained from different data centers may differ, because updates do not reach all of them at the same time.

  • Personalization of results. Google collects information about search preferences through cookies and through Google services.
    When a user is logged in, the search engine gathers information about what
    they search for, which pages they visit, etc. To disable this feature, the
    user can click "Disable adjustment based on search activity."

    Using several browsers, each of which stores a separate browsing history,
    also affects the personalization of results.

  • Search filters, such as searching "Polish pages only" or selecting a specific region. Pay attention to the default search filters, introduce them as needed, and modify them accordingly.

  • Google +1 recommendations also have an impact on search results.
    When a logged-in Google user searches for information, some of the short
    descriptions in Google's results may be tagged with recommendations from
    their friends.
    Content recommended by friends is considered more valuable than content recommended by strangers or not recommended at all. This feature also personalizes the search results.

  • The results-per-page setting, with which the user specifies how many results to see on one page. The default is 10 results, with dynamic (instant) search enabled.


Browser


Web browser
(from "to browse" - to view) - software for retrieving, processing and
displaying information from websites on the Internet, which are identified by
URLs and carry text, graphics and other objects.


Some types of web content (images, text) may contain links to other pages on the web.
With the help of search engines and hyperlinks, users can quickly move
between pages and sites and find the information they need.




Principles of operation

The main task of a browser is to download and display information from a website. This is done as follows: the user types the URL of the page they want to view into the address bar; the browser sends a request to the server, and the information obtained is passed back to the program; at the same time the HTML document is rendered into an interactive page.

Internet Explorer

Internet Explorer (IE) was one of the first graphical browsers, developed by Microsoft. Its first version appeared in 1995, and from 1996 IE 3.0 became part of Windows 95.
For the following ten years Internet Explorer was the leader in terms of
installed copies, but from 2009 it began to lose popularity (it is currently
the second most popular browser in the world and third in Poland).
However, it still holds up to 90% of the market in the UK, Korea, China and the countries of North America.

Pros:

  • traditionally bundled with Windows;

  • ease of use.

Cons:

  • incomplete support for web standards;

  • low performance;

  • frequent security problems.


Google Chrome


A relatively new browser that appeared in 2008, currently the most popular in
the world (in Poland, Mozilla Firefox retains the lead).
It is based on the free Chromium browser and uses the WebKit engine to display web pages. With Chrome, users get access to functions previously unavailable in browsers. Google Chrome can be installed on Windows, Mac OS and Linux.

Pros:

  • high level of security;

  • high speed of operation.

Cons:

  • a minimal set of built-in functions;

  • the browser does not work on systems older than Windows XP.


Mozilla Firefox

The first version of Mozilla Firefox appeared in 2004. It currently occupies third place in the world in terms of popularity, and first place in Poland.
The Mozilla Corporation is constantly improving the browser, not only for
Windows but also for Mac OS X, FreeBSD, Linux and other systems.


Advantages:

  • many built-in plugins;

  • availability of numerous extensions;

  • the ability to change the appearance (themes).

Cons:

  • a number of security vulnerabilities.


Opera

The first version of Opera appeared in 1995. Initially it was distributed under a limited (shareware) license, which may have contributed to its low popularity. Ten years after the browser's debut, the licensing rules changed and it received freeware status. One of Opera's advantages is its adaptation to mobile devices (the Opera Mini and Opera Mobile versions). It currently occupies fifth place in the world and fourth in Poland.

Pros:

  • high efficiency and speed of operation;

  • convenient navigation;

  • the ability to install multiple plugins;

  • versions adapted to mobile operating systems.

Cons:

  • some settings cannot be changed through the graphical interface;

  • problems with the operation of certain plugins.




Friendly URLs


A friendly URL is a unique web address that briefly describes the page's content and is clear and easy to read for users.
Most content management systems, when creating pages, give them addresses
consisting of a chaotic and (from a human point of view) illogical set of
characters, for example:
http://site/catalogue.php?sect=9&kind=34&manuf=samsung.
Such an address is understandable only to developers, not to regular users. The idea behind friendly URLs is to create short, concise, readable Internet addresses. The same address in a friendly version will look like this: http://site/catalogue/phones/touch/samsung/.



Impact of friendly URLs on SEO



Friendly addresses have a positive effect on position growth in search results. This is because they usually contain keywords, which search engines receive well.
A site structure organized around friendly URLs also makes it much easier for
people to find the site interesting, and users tend to link to information
from such sites.
Long URLs with no comprehensible meaning, on the other hand, repel and deter visitors. It is worth noting that address formatting should be carried out immediately after a page is created.
Later changes may result in a temporary disappearance from the search results,
because search engines need time to crawl the changed links again and remove
the old ones.




Setting up friendly URLs

Most CMS systems make it possible to automate the creation of friendly URLs,
which are generated from the headings placed on the pages; a minimal rewriting
sketch is shown below.
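A minimal, hypothetical Apache mod_rewrite configuration (placed in an .htaccess file) of the kind a CMS might use to map the friendly address from the earlier example onto the real script; the parameter names are invented for illustration:

    # .htaccess - rewrite /catalogue/phones/touch/samsung/ to the underlying PHP script
    RewriteEngine On
    RewriteRule ^catalogue/([a-z]+)/([a-z]+)/([a-z0-9-]+)/?$ catalogue.php?sect=$1&kind=$2&manuf=$3 [L,QSA]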




Pros of friendly URLs:


  • Users can better understand the structure of the site;

  • Such addresses are easy to remember;

  • to go up a level, visitors only need to remove the unnecessary part of the address;


  • when returning to the site, visitors can go straight to the section that
    interests them, skipping unnecessary pages, guided only by previously used
    addresses;

  • an improved position in search results.



Cons:


  • increased load on the server.


Snippet


Snippet (from the English "snippet" - a fragment) - a small piece of text displayed next to a link in the search results. In other words, it is a short description of the web page matching the query submitted to the search engine.



The role of the snippet in SEO



Keywords from the query are highlighted in the snippet. Its content is often enough for the user to get the information they need without visiting the page.
The snippet's role is often underestimated, but to motivate users to enter
the website you should take care to make the snippet eye-catching.

A competing page listed a few positions lower can, thanks to a well-optimized
snippet, enjoy greater popularity than a page in first position with an
unpolished snippet.

The more accurate and concise the answer the snippet gives to the query, the
more click-throughs the page will record, which has a significant impact on a
high position in the search results.




How to influence the content of the snippet


The snippet is formed by the search engine according to its own algorithms,
and its quality, as mentioned above, affects how many users go to the site
from the search results.
The question is how an SEO specialist can influence its content. In general, search engines take into account the page's meta description tag and its content. Each search engine has its own nuances, and knowing them allows you to shape the contents of the snippet.


In Google, the meta description tag is accepted as the basis for creating the
snippet if the search engine finds that it corresponds to the query and can be
useful for the user.
Therefore be sure to fill in that tag. Google's snippet length is 160 characters.
The meta description may exceed that length, and the search engine will still
take the whole text into consideration.
It follows that you can influence snippets in Google through a well-written meta description: enter two sentences there containing both the key phrase itself and its variations, and make sure the page content is skillfully positioned for those key phrases. A minimal example follows.
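A minimal example of a meta description written according to the guidance above (the wording, the key phrase and the page are invented for illustration; the roughly 160-character limit is the figure quoted in this text):

    <head>
      <title>Touch phones - Samsung models | Example Store</title>
      <meta name="description"
            content="Touch phones from Samsung at competitive prices. Compare touch phone models, read reviews and order online with free delivery.">
    </head>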

Stop-words (stopwords)


Stop words (stopwords) are an SEO concept.
These are words that add nothing to the text from the point of view of logic
and communication and carry no substantive content - most commonly pronouns,
prepositions, numerals, etc. These words are ignored by search engines.




The following is a list of commonly used stop words (translated from a Polish-language list, which is why some entries repeat):




and that, ah, though, but, oh, either, but, oh, no, even, more, much,
because, in fact, that they were, by no means, be, was, was, was, were,
will be, will be, inches, whole, whole, you, you, you, what, whatever,
something, sometimes, sometimes, why, if, that is, far, for, why, why,
to, well, nowhere, pretty, much, two, two, two, two, today, today, when,
if, for, where, somewhere, someplace, it, and, the, what, the, other,
other, other, other, that, I, it, like, some, like, which, for some,
what, a! What, however, as, somehow, it, one, one, one, however, but,
its, its, to him, is, I am, yet, if, if, now, it, all, When, several,
someone, anyone, anyone, who, who, whom, whose, who, whom, where, who,
for, years, but, or, have, have, little, I, I, even between, me, me,
can, my, my, my, my, can, can, can, my, it must, we, on, over, us, us,
us, our, our, our, our, our, our, however, immediately, even her,
nothing they do not, let, him, her, him, never before, them, than, well,
the next, from, around, he, she, they, they, it, and, behold, yes, you,
you, you, the, the, the, though, over, since, should, should, should,
should, in addition, the law, after all, in front of, above, before,
the, the, the, Also, alone, alone, are, now, where, each, other, meaning
their, that, as such, such, such as, there are, of this, this, that,
now, also, that, you, you, so, you, here, here, your, your, your, your,
thy, your, you, the, only, the, in, the, you, you, you, yours, yours,
yours, we, according to many, many, more, more, all, all, all, all, all,
the, you, just, with, for, perhaps, forever, from, zł, again, again,
was, no, no, no, any that to.


The word "no" is not a stop-word, it means the plant.




Due to constant changes in the algorithms, the list of these words requires
regular updating, and each search engine has its own database of stop words.

When creating content for a website, it is therefore worth paying attention to
the frequency of stop words and their ratio to the total number of words in
the text; a small sketch of such a check is shown below.
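A minimal Python sketch of the ratio check suggested above - the stop-word set here is a tiny, invented sample, not any search engine's actual database:

    import re

    STOP_WORDS = {"and", "the", "of", "to", "a", "in", "that", "is"}  # illustrative sample only

    def stop_word_ratio(text: str) -> float:
        """Share of the words in the text that are stop words."""
        words = re.findall(r"\w+", text.lower())
        if not words:
            return 0.0
        return sum(w in STOP_WORDS for w in words) / len(words)

    print(stop_word_ratio("The density of stop words in the text is worth checking."))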

The linked page


The linked page is the page being positioned, to which the links placed on the linking page point.



The linking mechanism

The linked page and the linking page are closely related, and both take part in the positioning process.
The mechanism connecting them can be summarized as follows: on a linking page
with a high PageRank, which enjoys the trust of the search engines, a link to
the linked page is placed; through that link the linking page passes a portion
of its importance to the linked page, while its own importance is not reduced.
In this way the linked page gains importance, although it does not give a link back. The linked page may in turn become a linking page for other recipients.
Positioning through links remains the most effective and popular method used
by SEO specialists, allowing a natural rise to high positions in the rankings.

Search engines notice that a young page has a high citation index - that is,
it is referred to by a number of trusted sites - and the page will therefore
appear higher in the search results.




What harms the linked page



When positioning a linked website through links, a wrong link-acquisition strategy can do harm. First of all, a large number of inbound links should not be allowed to appear in too short a period of time.
This happens when the SEO specialist buys a disproportionately large number of
links to the positioned pages at once, which is not typical of their natural
growth.
Secondly, placing links on low-quality pages has a negative impact on the linked page.
The linking site should have a theme similar to that of the linked page, an
appropriate number of external links woven organically into the text, and
should enjoy the trust of search engines and Internet users.

The linking page


The linking page is a page that transfers part of its value to another page, the so-called linked page. This value is transferred by means of the links placed on the linking page.
The role of linking pages in positioning is significant, because strong pages
can pass on a lot of value, which has a positive effect on the position of the
linked pages in the rankings.






Finding linking pages

The SEO specialist's task is to find suitable web pages and place links to the positioned site on them.
Pages can be searched for manually (in thematic directories, social networks,
forums), which is quite time- and labor-consuming, or through dedicated link
marketplaces.

In link marketplaces, webmasters make space on their websites available, for a
fee, for the placement of links ordered by SEO specialists.


Pages offered in link sales systems should meet a number of criteria:




  • high PageRank;

  • appropriate domain age;

  • subject matter close to that of the linked page;

  • an appropriate domain level;

  • the number of external links on the page should not exceed three.


Doorway pages


Doorway pages
(also called Bridge Pages, Gateway Pages, Jump Pages, Portal Pages, Zebra
Pages, Entry Pages) - websites created to redirect visitors to a specific
website, or to pass on their reputation to it by linking.

Such pages cannot be considered full-fledged sites - they are supporting
platforms designed to aid the positioning of the proper site.

Following the classification of positioning methods, doorway pages can be
classified by color, depending on the aims and objectives of their creators:






  1. "Black" doorway pages




These are designed to redirect visitors automatically, usually via a Meta
Refresh command, to the target site, which explicitly violates the
recommendations of search engines.
Such a page does not contain structured text - it is not meant to be read at all.
Its content is generated by special programs as an interconnected set of
strings stuffed with keywords and phrases, usually at a very high density; a
sketch of the redirect technique (as something to avoid) is shown below.

Operating principles:


With the help of special generators, a large number of over-optimized pages
are created on free hosting; they link to one another, slightly improving each
other's parameters along the way.
Thanks to this linking and over-optimization, such pages rise to the top positions of the search rankings.
Internet users, taking them for pages that match their query, click on them
and are then automatically transferred to another site - one that often has
something in common with the original topic, but is located far down in the
search results.

Search engines firmly oppose black doorway pages, constantly improving their
algorithms, and when they encounter such pages they automatically impose a
ban on them.

Positioning with their help is therefore becoming less popular, but their
simplicity and effectiveness continue to attract impatient SEO specialists.




  1. "Grey" doorway pages



They are designed to pass the value of inbound links on to the proper site. Doorway pages of this type have no automatic redirection set up and are not over-optimized for keywords.
Search engines therefore rarely bring these pages up in the search results,
because their content cannot fully match the queries (due to the lack of key
phrases).
Nevertheless, gray doorway pages contain logically structured text, semantically close to the subject of the proper site.
A few links to the proper site are inserted into the text, adding to its
value, which is what the positioning process requires.
The address of such a page is then sent out to Internet users as spam, using address databases.
It is difficult for search engines to combat gray doorways, since strictly
speaking they do not break the rules: they do not automatically redirect users
to other URLs and do not contain over-optimized, illogical text.

Nevertheless, they cannot be regarded as fully compliant with the guidelines,
because users learn about them from spam, which the search engines oppose.




  1. "White" doorway pages



These pages do not violate any search engine guidelines. They contain valuable information, often of a commercial nature, together with links to the proper site; one can say that they are essentially sites advertising the main service.
The boundary between white and gray doorways is often blurred, since white
doorways are also often promoted through spam, because traditional methods
consume a lot of time and money.




How search engines treat doorway pages



Search engine guidelines for webmasters always prohibit the use of deceptive doorway pages, i.e. black doorways.
When a search engine finds them, they are immediately banned - thrown out of
the search engine's index and, therefore, out of the results list.

Search engines usually have no grounds to apply sanctions against gray and
white doorway pages, since these do not violate the recommendations and, in
particular, do not send out spam themselves.
It is worth noting that the pages to which doorways lead are not themselves punished; an exception may be cases where the doorway sits on a subdomain of the main site.

Title


Title is one of the most important tags on a website; it contains the header of the HTML document. When accessing the site, the search engine robots that analyze its content are guided first of all by this tag.



There are a few useful tips on the header title:





  • The header should reflect the essence of the page and be written for the user, not for the robot.

  • Header length should fall between 50 and 60 characters.

  • The header should contain at least one of the key phrases.

  • Key phrases in the title tag must correspond to the phrases in the text of the page.

  • Each page of the website should have a unique title.

  • Conjunctions, prepositions, pronouns, etc. are undesirable in the header.

  • Do not build the title out of special characters (a short example follows this list).
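A short example of a title that follows the rules above - roughly 50-60 characters, one key phrase, unique to the page, no conjunctions or special characters (the wording is invented for illustration):

    <title>Car Auctions - Used and New Cars for Sale in Warsaw</title>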




Following these simple rules will satisfy not only the search engines, which
will crawl the site more effectively, but also users, who on entering a page
titled "Car Auctions" expect to bid on cars, not to see advertising for
cosmetics.

Otherwise users will immediately leave a website that does not meet their
expectations, increasing the page's bounce rate, which has a negative impact
on its position in search results.

TOP


TOP (from the English "top") - the pages located on the first page of search results for a particular query. Getting into the TOP is the primary goal of positioning. The TOP may differ between search engines, which is due to differences in their search algorithms.
It is possible to reach the TOP on Google, Yahoo and other search engines
simultaneously, but the specific characteristics of each engine's algorithms
must be taken into account: although similar in many respects, they often
differ radically in their details.





SEO specialists use terms such as TOP1, TOP3, TOP5 and TOP10 to denote,
respectively, the ranking leader, the first three results, the first five and
the first ten pages.

Continuing the count further, for example to TOP20, makes little sense,
because by default a search engine shows 10 results per page in response to a
query.

Bringing a site into the TOP10, i.e. onto the first page of search results,
should therefore be the minimum goal of effective positioning.




The advantages of being in the TOP10


When searching for information, most users read only the first page of
results and then make the query more specific or more general.
Pages occupying the top positions in the search results therefore objectively attract more attention, which allows the goals for which the site was created to be achieved more quickly.

The one drawback is that sites in the TOP must be continuously monitored so that they do not fall from their leading positions.
To prevent this, positioning activities should be continued, although less
intensively than when the site was first brought into the TOP.

URL


URL (Uniform Resource Locator) - a standardized format for addressing
resources; a standardized way of writing addresses on the Internet.


Each URL is unique and indicates a location on the network.
Initially its creator, Tim Berners-Lee, intended the identifier for addressing
individual files in local networks, and later on the global Internet.
URL now has its own standard (RFC 1738) and is used to address any files and nodes on the network.





URL format

URLs traditionally have the following form:
<protocol>://<page type>.<domain name>:<port>/<URL path>
For example: http://www.example.com:8080/somepath.php





  • Protocol - specifies the type of data transfer: http - plain text, https -
    text sent over an encrypted connection, ftp - file transfer protocol,
    mailto - an e-mail address.

  • Page type - specifies which browser the page is adapted for. According to
    the standards adopted initially, all URLs began with the letters www,
    identifying the site as a resource available on the Web through a regular
    browser (for mobile phones the prefix wap - Wireless Application Protocol -
    was adopted later). This convention is now used much less often, and if no
    type is shown before the page name, it is assumed that the site is intended
    for a regular browser. If the site is suitable for viewing on a mobile
    device, or works both ways, the wap and www prefixes are shown.

  • Domain name - the unique address of the resource on the network, expressed in characters.

  • Port - the port number used for access. Every web application has its own data exchange protocols, which are tied to specific ports; the HTTP protocol works with TCP ports 80 or 8080. If the server to which the request is sent serves only web pages, the default port is not shown. If the server can also be accessed in another way, for example via FTP, the port number should still be shown.

  • URL path - indicates the exact location of the page on the server. Initially a URL could contain only the basic Latin alphabet, digits and some punctuation marks; now, thanks to transcoding, a URL may consist of symbols from many different alphabets: Arabic, Chinese, Cyrillic, etc. (a short parsing sketch follows this list).
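A minimal Python sketch that splits the example URL from this section into the components listed above:

    from urllib.parse import urlsplit

    parts = urlsplit("http://www.example.com:8080/somepath.php")
    print(parts.scheme)    # protocol: http
    print(parts.hostname)  # page type + domain name: www.example.com
    print(parts.port)      # port: 8080
    print(parts.path)      # URL path: /somepath.php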




Internet services that simplify working with URLs

The disadvantages of URLs include their lack of clarity, their length and their lack of flexibility: pages may change or disappear, while the URL continues to lead to a non-existent page.
Web services have therefore appeared that make it possible to simplify URLs:



  • PURL (Persistent Uniform Resource Locator - a permanent URL). PURL keeps URLs in dedicated databases; when the target link changes, this information reaches the database and the necessary changes are made, so the externally visible address remains unchanged.
    This is the preferred method for pages with dynamic content that may change
    frequently and change location: search engines index the page under its
    PURL, and even if the target path changes, the file or page can still be
    found on the server and does not lose its position in the search results.


  • Short URL - a generic name for web services that significantly shorten the
    length of a URL by creating aliases (synonyms) for the target URL under a
    short domain name.


  • Friendly URL - an address that is easy and convenient for the user, which
    greatly facilitates its use not only by Internet users but also by search
    engine robots.

    Friendly URLs are obtained through so-called rewriting, which is usually
    configured in the .htaccess file.
    Many CMSs offer this feature, such as Joomla!, where each page is assigned a number corresponding to its record in the database.


  • Usability


    Site usability (ease of use, ergonomics) - a property of a site that
    determines how easily a user can navigate between its pages and work with
    its interface.


    Usability is made up of a number of factors, such as:



    • Learnability. This describes how quickly a user who sees the site's
      interface for the first time acquires the skills needed to perform
      simple tasks and functions.

    • Efficiency of use - determined by the convenience and speed with which
      the visitor can move around the site and accomplish tasks.

    • Memorability. This describes how quickly the principles of the interface
      and its navigation can be memorized. If the rules of navigating the site
      have to be learned again on the next visit, the site may be hard to
      remember.

    • The number and severity of errors - a factor describing how often people
      make mistakes when moving around the site, how serious those mistakes
      are, and how easily they can be corrected.

    • Subjective satisfaction - the user's personal evaluation of the site.



    The importance of usability


    Research by specialists such as User Interface Engineering Inc. shows
    that in 60% of cases users cannot find what they are looking for.
    This leads to lower productivity, loss of valuable time and a decrease in the number of visits to the site.
    According to data provided by Forrester Research, "because of poor
    usability, stores are losing 50% of potential buyers, 40% of whom will
    never come back." Studies of user behavior on the Internet show that
    visitors react badly to slow-loading sites with a complex structure.
    Internet users do not want to wait for a page to load completely, nor to spend time working out the rules of navigating a site; on today's Internet there is no such thing as a manual for using a website.
    The visitor wants to grasp the functionality of the site within the first
    seconds of exploring a page, to know immediately how to navigate the site,
    and to quickly remember its structure and the functions it offers.
    If this does not happen, the user simply moves in a different, more rewarding direction.
    Showing concern for usability means designing a site dedicated to the
    user, which in turn affects such important indicators as the frequency of
    visits and conversions.




    Design and Usability Testing

    The process of creating a functional and usable website can be divided into two stages: development and testing.



    • Usability engineering (designing usability) is a methodological approach
      to creating a functional and user-friendly web interface.
      The result of this stage should be a properly functioning, efficient and easy-to-use site. Usability design consists of several stages, the most important of which are:

      • collect data and requirements for the proposed service;

      • development and testing of a prototype;

      • evaluation of design and navigation alternatives;

      • analysis of the problems that you may encounter;

      • provide solutions and testing them.



    • Usability testing. This is the final phase of usability development. A standard test is performed by a user, who carries out a certain set of tasks using the prototypes. During these tests, everything the tester does and says is recorded. A typical test is performed by at least two users working together. As a result of testing, the following should be collected:

      • information about the sequence of steps (actions) needed to achieve the goal;

      • information about the mistakes made by the user;

      • the causes of the difficulties encountered;

      • data on the speed of task completion;

      • subjective evaluation of the user's activities on the site.





    The purpose of testing is to detect potential problems that users may encounter and to find ways of solving them.


    The stages of creating a functional and convenient website




    • The preliminary stage - at the initial stage, you specify:

      • goal that guided the creation of service;

      • target group of Internet users;

      • the characteristics of the members of the target group and the
        purposes for which they visit the site.
        These purposes are generally the same for most sites, but depending on
        the subject matter of a particular service the usability designers may
        identify additional goals.

        When setting goals, it should be remembered that the site should be
        simple, easily digestible and efficient to use, easy to remember and
        satisfying from the point of view of the visitor's subjective
        assessment.



    • Collection of data and requirements
      (for existing sites) - at this stage, information is gathered from users
      about their requirements for the site, and an assessment is made of how
      well the current website design meets those requirements.
      Data can be collected from users through:

      • surveys and feedback forms;

      • analysis of server logs;

      • Usability testing of an existing service.



    • Development of a prototype. It is well known that it is much easier to study the effect of changes on a real object, which is why website prototypes are created. They are built with a minimal amount of content and without heavy graphics.
      At this stage, the most important thing is to analyze the opinions and
      comments of the testing users and to determine how well the website
      prototype performs the set tasks.

    • Content creation and optimization. One of the main criteria of good usability is the attractiveness of the content to the user. Of all the information gathered, only what is really needed, interesting and understandable for the user is placed on the site.
      During the first few seconds of viewing, most Internet users scan the
      visual information arranged on the page and read only what catches their
      attention.
      The text on the page is therefore divided with headings, titles and subtitles, and paragraphs, lists and tables are used to make reading and sorting the information easier.

    • Organization of page navigation.
      The main criterion of well-designed navigation is the simplicity and
      ease with which users can move through the pages of the website.
      Navigation should therefore be the same on all pages, intuitive, understandable and easy to remember.
      Usability testing is an iterative process of detecting the factors that
      help the user and those that hinder them, carried out in order to
      eliminate the latter.

      Testing is typically repeated several times, until a final variant is
      developed that meets the usability requirements.


    Webmaster


    A webmaster is a specialist who creates websites and web applications. A good webmaster has skills in layout, programming, site administration, design, content creation and positioning. The term webmaster appeared in 1992 in the text "Style Guide for Online Hypertext" by the father of the Web, Tim Berners-Lee.



    Scope of work

    The profession of webmaster as such does not formally exist: it is not officially recognized and is not taught at universities as a separate course.
    Usually webmasters have considerable technical expertise in programming
    and administration, in addition to skills in web design and positioning.
    Webmasters can work for companies, for example in web design studios, or remotely as freelancers.

    Expectations for qualified webmasters are generally high and the specialist should have a wide range of skills, such as:






    • the ability to use at least one programming language (e.g. PHP), the
      ability to work with someone else's code, and a basic knowledge of other
      languages (Perl, Python, Ruby, etc.);

    • basic knowledge of HTML and cascading style sheets CSS;

    • an understanding of how the network operates and of the HTTP and CGI standards;

    • ability to work with graphics programs;


    • conceptual thinking - the ability to reason about the logic of the site
      and the interaction between modules and scripts, as well as knowledge of
      CMS systems;

    • basic knowledge of internet marketing and the ability to apply it when
      designing for usability;

    • knowledge of optimization and positioning.



    As you can see, the webmaster's list of competences is very long, and it is by no means complete. Of course, a high level of competence in all these areas is not attainable.
    For this reason, large companies employ several specialists responsible
    for specific areas: copywriting, design, programming, project management,
    etc. In any case, the webmaster should have at least a basic knowledge of
    the entire process of working on the website, since a website is
    essentially a homogeneous whole whose smooth operation only the webmaster
    can guarantee.
    Similarly, a freelance webmaster cannot be responsible for every phase of creating a page.
    For small sites this is possible, but a large, serious project may require
    the participation of many people with different competences.
    In such cases the webmaster's role is to coordinate and control the work at every stage.



    Education


    Admission to a webmaster position is usually decided not by a diploma,
    technical courses or previous employment, but by the portfolio - the
    number of completed projects, their scale and who they were made for.
    Webmasters can also obtain certificates issued, for example, by Microsoft (MCSD, MCSE + Internet and MCDBA).

    Search Results


    Search results (SERP - Search Engine Results Page) - the page on which the search results for a specific user query are displayed. It contains links to websites whose content matches the query, along with a brief description of each page (a snippet). The links are arranged as a ranking list built on the basis of relevance to the given query.
    After the user enters a query, the search engine analyzes it from various
    angles (linguistic, morphological, geographical, etc.) and, according to
    the results of its algorithms, builds and displays a list of pages in order
    of their relevance to the query, with the most relevant first.

    Search results for the same query may differ between search engines,
    which is caused by the use of different search algorithms.
    Each system has its own way of selecting the most appropriate responses based on certain factors.



    The search results page consists of the following components:





    • A list of URLs matching the user's query. This is the basic element of the results, called the organic search results. Images, videos, maps, news and audio files included in the search results also belong to this element.


    • Paid advertisements (sponsored links), presented either as standard
      results above the organic results or as short text ads placed beside the
      list.

    • Links to further pages of the ranking. A results page contains 10 organic results; the leading items in the search results are called the TOP - the first page of results is TOP10, the second TOP20, and so on.


    • Under the search box, some search engines display short blocks of
      information generated as the query is typed, giving a quick answer to
      the question.
      These blocks have different names in different systems: in Google - OneBox, in Yahoo - Shortcut.
      They may contain a suggestion to correct an error in the query, a
      weather-forecast block, the definition of a term, a calculator, etc.



    What affects search results


    Search results are not a permanent phenomenon; they change along with the dynamic changes taking place on the network:



    • Changes in web documents: their appearance in or removal from the search
      engine's index and, as a result of modifications, changes in a document's
      relevance to the query.

    • Updates of the search engines.

    • Changes to the search algorithm.

    • Geographical location. Search results may be regional, with pages from the same region competing with one another. The region of a query is determined automatically from the IP address.
      In results without the regional parameter (geotargeting), positions will
      differ from those in the regional results.

    • Adding unique content. Pages created with the user in mind are favored by search engines.


    Web Search


    An Internet search engine (search engine) is a program with a user interface for searching for and displaying information according to the user's query.



    The types of search engines:


    • Global - designed to search for information on the Internet;

    • Local - searches the local network or specific services.



    Global search systems can be divided into the following types:



    • Universal - search engines that let users search content of any type: text, graphics, audio, video, across all network resources. The world leader among universal search engines is Google; besides it there are the well-known Yahoo!, Bing, Yandex and Baidu. In Poland the Google engine handles about 98% of queries; in addition to its own Google.pl service, it also powers search on sites such as Onet.pl, Interia.pl and Gazeta.pl;

    • specialized - their job is to find information meeting specific requirements; such systems find files on FTP servers, goods in online stores, or information in Usenet (the worldwide system of newsgroups);


    • idea-oriented - such search engines look only for information on the
      Internet that may interest particular social groups (religious,
      professional, etc.).



    The structure of a search engine


    The term "search engine" is usually understood as a global, universal
    search and will continue to question just about these systems.
    Schematic structure of most search engines is similar and there is no significant difference between them.



    Interface

    The visible part of a search engine is a website that provides an interface for submitting queries to the search engine; it also displays the search results for a given query.

    Software designed for indexing and retrieving information


    • search algorithm;

    • a database of page addresses and of information about Internet and intranet resources.



    The search algorithm is the active part of the search engine; its duties include:


    • indexing of websites and their content;

    • ranking websites and web pages;

    • the formation of the search results.




    The database (index) is used to store the addresses of the sites and pages
    known to the search engine, as well as the content, links and other
    information found on them.

    The index is divided into sections and placed on multiple networked servers
    distributed around the world (in the case of the major search engines).




    Principles of how search engines work

    Indexation



    The search algorithm operates continuously, scanning the global network
    24 hours a day for new resources, following the links it finds and adding
    (indexing) the new addresses and information to the database (index).
    Pages to be indexed should meet certain requirements concerning:


    • the uniqueness and quality of the content (accuracy, informational value and structure);

    • the quantity and quality of links pointing to the page;

    • user activity on the site;

    • the absence of malware;

    • compliance of the content with certain requirements (for example, bans on
      publishing material that breaks the law or incites terrorism);

    • compliance with certain search engine optimization rules.



    Creating the page ranking and displaying the search results


    In response to a user query, the search system's task is to scan its
    index, find and offer the addresses of the pages that contain the search
    term or the combination of words forming the key phrase.
    If no page matches the key phrase, the search engine chooses the pages whose content is closest to it.
    Since the number of page addresses matching a query is usually very large,
    the search engine faces the task of ranking those pages appropriately.

    In other words, because search algorithms must give the user the chance to
    review the responses that best match the query (which in practice is
    generally impossible to do exhaustively, given the very large quantities
    involved), the creators of search engines decided to display the results
    as a structured ranking, led by the sites with the best indicators, with
    pages of progressively weaker indicators arranged further down the list.
    The search results page is a list of addresses. In addition, short pieces of text describing the content, so-called snippets, are displayed.



    Penalties imposed by the search engines

    If the search engine finds that a site is being positioned with prohibited techniques, penalties may be imposed:


    • lower position in the ranking;

    • removal of pages from the index, the so-called ban.




    Techniques that are not permitted and may lead to the search engine
    imposing penalties on a page include, but are not limited to: black hat
    SEO, publishing prohibited material, distributing malware, etc.




    Priorities in the development of search engines


    Search



    The steadily growing number of pages confronts search systems with the
    task of developing ever-new ways of organizing data and new search
    algorithms.

    One solution that search engine developers see is document clustering -
    automatically creating groups of semantically similar documents.

    The criteria for forming the groups are not known in advance; they are
    determined automatically on the basis of the similarities observed.


    Displaying the results

    All search engines aim to present the broadest possible base of answers to user questions.
    Search algorithms are therefore improved so that the results are made up
    of the pages most appropriate to the query, whose content is the most
    interesting, well organized, unique and information-rich.


    Accurate and comprehensive results


    The indexes of the major search engines contain billions of addresses, and
    the volume of information that can be found through them amounts to
    hundreds of thousands of gigabytes.
    In addition, the major search engines also offer the ability to search images, audio and video files.


    Freshness of results

    In addition to basic search algorithms, such as Google's Panda, search engines also use more specialized algorithms. An example is Google's freshness algorithm, thanks to which Google constantly scans news and can index it within a few minutes of an event.


    Speed


    The time needed to process a query and produce the results is one of the
    most important parameters characterizing a search engine, and its creators
    constantly try to minimize it.
    At present, the processing time of a single query in the leading search engines is approximately 1/4 of a second.

    Search index


    A search index
    is the search engine's database, in which information about web documents
    collected by the search engine's crawlers is saved and stored in a
    structured form.
    The process of adding documents to the search system's database, organizing them and storing them is called indexing. Each search engine performs indexing with its own algorithms.

    What is included in the index

    The content of the index is structured data consisting of key phrases,
    text and multimedia elements, and links, organized so as to shorten the
    time needed to search the database and find the documents matching a
    query.
    The database is constantly updated by crawlers that continuously scan the web for new pages, content and resources.

    Adding a site to the search engines
    In the process of positioning, indexing plays an important role: the faster a site is indexed by the indexing bot, the sooner it can be expected to appear in the search results. To speed up indexing, the search engine should be notified of the existence of the new page on the network. For this purpose every search engine has an "add URL" function, where pages are submitted to the index by filling out a special form. Creating a profile for the page on social networks and spreading information about the site also has a positive effect.
    Which pages are crawled and indexed can be managed using the robots.txt file; a minimal sketch is shown below.
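    A minimal robots.txt sketch of the kind referred to above (the paths and the sitemap address are invented for illustration):

        # robots.txt - allow all crawlers, but keep the admin section out of the index
        User-agent: *
        Disallow: /admin/
        Sitemap: http://example.com/sitemap.xml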
