User-Generated Content – What Is It?


In the past, in the world of marketing, the advertiser spoke and the audience listened. Today, the relationship between the seller and the buyer has changed dramatically. Thanks to the Internet, we’ve said goodbye to the imbalance in communication. Monologues have been replaced by dialogues, and what the client says is now sacred. Why? Because User-Generated Content pays off. What exactly is it and how to use this tool in marketing strategies? Let’s find out!

What is User-Generated Content?

It’s content created voluntarily by users, not by brands. It can take different forms, ranging from various types of text and pictures to videos. According to the OECD definition, UGC must be at least partially “creative”. And… that’s pretty much it. As you can see, it’s a very broad definition.

Types of UGC

User-generated content can take many forms, most commonly:

  • social media posts – including comments under brand posts on fan pages, reviews, or creative responses to marketing communications,
  • posts on internet forums,
  • reviews,
  • vlogs,
  • blog entries touching upon topics related to a given brand.

In short, it covers all kinds of user opinions connected with the brand.

The Value of User-Generated Content

Let’s stop and consider – how do you make purchase decisions? What affects you more: an ad that promises product X is the best in the world? Or a comment from a user on a brand’s fanpage or an opinion portal, honestly pointing out the advantages and disadvantages of a given solution? Which is more credible to you? What kind of content are you looking for yourself? This is why UGC is so important.

Below is a list of the advantages of this type of content.

  • Credibility

According to Nielsen research, 92% of people are willing to believe an opinion heard from a person they know, and as many as 70% find online opinions credible. What does this mean? An authentic message coming from a “normal user”, not a marketer, is much more credible than an advertisement.

  • Authenticity

Who still believes ads present a true picture of reality? They’re widely regarded as sugar-coated and exaggerated. No wonder – that’s their “poetic license”. Authenticity, on the other hand, is significantly higher in reviews and other product-related content created by users.


This has given rise to the popularity of “expectations vs. reality” comparisons, which juxtapose photos from promotional materials with pictures taken after purchasing the product (like a fast-food hamburger or clothing from an online store). As a result, review blogs and YouTube channels of this type are becoming more and more popular.

  • Impact on Purchase Decisions

User-generated content can tip the scales toward completing a transaction, especially if clients are in the brand’s target group and are already interested in a given product. This has been confirmed by research – as many as 80% of respondents agree that UGC has a significant impact on decisions to buy specific products.

  • Brand Awareness Activities

UGC builds brand awareness – it arouses interest in products, creates positive connotations, and strengthens the bond between the consumer and the company. These relationships are not purely “business” – especially since much of the content generated by users is created under the influence of emotions (both positive and negative!). Therefore, it builds long-term engagement.

  • Inbound Marketing

A model in which customers come to a company and create buzz. This is a much more effective way of acquiring a consistent audience – one that is loyal and returns time and time again, attracting new customers.

  • Relatively low costs

It’s difficult to estimate the costs of generating relevant UGC. Coca-Cola paid a lot for creating beverage packaging with personalized labels that very quickly became online and offline stars. A company that creates a simple contest on Instagram will pay a different price. One thing is certain: a skillfully (and honestly!) cast fishing line will hook profits.

  • Strengthening SEO activities

In the SEO world, “content is king”. But not just any content – only content that has real value for users. Google’s algorithms no longer focus solely on the number of keyword repetitions. Keyword use is still important, but it’s not the most important factor. What matters now is the naturalness, user-friendliness, and usefulness of the content. Therefore, if users post links to a brand’s products on their social media profiles or blogs, the chances of ranking higher increase significantly, while the cost of website positioning goes down.

  • An excellent source of feedback

The goal of every company is to create products that are eagerly purchased by customers from a specific segment. However, it’s not you but your recipients who know what they expect. You can try to shape their needs and point to available solutions, but you also have to listen to feedback – not just the songs of praise, but above all the criticism. UGC is a good source of such information: you can often find valuable suggestions in comments and reviews that allow you to adjust your strategy to work better for your customers and your business.

Even this shortened list of benefits shows that User-Generated Content is worth taking care of.

Creating User-Generated Content

Create an atmosphere that invites people to share impressions, experiences, and thoughts. What products would you be willing to tell your friends about? What do you recommend to them? In what situations are you willing to take the time to review a hotel you stayed at? Specific emotions are associated with these experiences.


People willing to create UGC are a very diverse group. Some will share negative experiences, others broad opinions, and still others practical advice, as in a Facebook group or on a forum. Some users can be encouraged to create UGC; others will motivate themselves.

But how can you influence the frequency of UGC generation?

  • Social Media Contests

This is a simple way to increase the buzz towards your brand. Prepare a contest for users in which you offer them valuable prizes, e.g. for preparing a short post related to your products. You can also reward them for writing the most interesting review, creating a photo related to the theme, a short story, a poem… The more creativity you unleash, the better the entertainment. Sometimes the opportunity to review the works of others becomes a better reward than real prizes!

  • Encouraging Reviews

Reviews have long been recommended for online stores – they can matter even more than low prices. Today, thanks to Google’s review system, web users look for reviews of almost everything: restaurants and bars, entertainment venues, offices, stores, even tourist destinations. Encourage customers to leave reviews. How? You can simply ask – hand out a leaflet or send an email with such a suggestion. You can also offer additional incentives, such as discounts on future purchases. Don’t just reward those who leave the highest ratings and raving reviews! Remember: frank opinions and authenticity are the secrets to the power of User-Generated Content.

  • Engaging Social Media Posts

If you have a large enough user base on social media, asking an engaging question is an easy way to get a huge amount of user-generated content.

Sometimes a simple question about how their day is going will suffice. In other situations – a serious question about their opinion on an issue related to the brand’s business. But beware: such discussions can get heated! That’s why you need a moderator. Don’t forget: “moderating” doesn’t mean deleting unfavorable comments, but, for example, reacting to emerging heckling and answering the questions raised, including the critical ones!

  • Interacting on Social Media

Comments about experiences with the brand, or about specific situations that evoked emotions, may appear on your social media profiles. You can’t leave them unanswered. To generate UGC, include open-ended questions – ask and encourage your audience to “open up”. It’s also worth opting for a looser communication style – especially in response to negative comments. If you keep it lighthearted, your answers may become a topic of discussion in their own right. And that’s the point!

What to avoid when creating UGC?

There are many ways to inspire your audience to create content related to your brand. Remember, it needs to be done thoughtfully.


Avoid:

  • excessive encouragement to comment and share – if it appears too often and is too pushy, it will discourage rather than encourage creativity and interaction,
  • artificial crowds – it’s better to communicate openly and honestly on behalf of the brand (or through its ambassadors) than to pretend to be an ordinary user. Fake posts will be detected quickly, and the campaign will end in defeat, not victory,
  • publishing UGC without the creator’s permission – for example, if you receive an email with a positive review of a company, ask before you publish it. The same goes for creative works sent to contests and other company events. Ask for permission to publish, preferably in writing, and only then brag about what you received,
  • creating campaigns aimed at building User-Generated Content among people outside your target audience or in a style that does not match your brand – after all, the ultimate goal of such communication is to build a specific image among specific audiences.

If you want to know more about UGC and other inbound marketing tools and are wondering how to implement them in practice – let’s talk about it! Drop us a line!

Sitemap – What Is It?


Good spatial orientation in the wild is a desirable attribute not only for travelers, but also for a special type of computer program – network bots. Unfortunately, most of the bots crucial for effective site indexing have not been endowed with a sense of navigation and require our help. Enter the sitemap, riding in on a silver stallion. 🙂

Optimizing a website consists of many steps; however, even perfect preparation is of little use if web crawlers don’t add all the pages to the index… And this is where sitemaps come to the rescue. This inconspicuous XML file is an indispensable guide that allows web robots to quickly discover every important element of a site, accelerating its indexation and appearance in search results.

History of the Sitemap.xml Protocol

The first mention of sitemaps appeared on the Google blog in June 2005.

The main task of sitemaps was to provide Google with information about changes on a website and to increase the number of indexed subpages. From a 15-year perspective, we can certainly state the task was a success – the sitemap is still fulfilling its original role.

Google did not limit itself to increasing its crawling potential – the sitemap.xml project was granted a Creative Commons licence so that other search engines could use it as well. A year later Yahoo! and Microsoft declared their support for sitemap.xml, and in the following years Ask.com and IBM joined them.

Sitemap – What Exactly Is It?

A sitemap is a list of URLs we want to point out to web crawlers. XML (Extensible Markup Language) is a universal markup language used to describe data in a way that is accessible from any platform and in any technology used by a given system.

The sitemap should include all addresses that are valid and public – the home page, information pages, category pages, product pages, etc. In the map file you can – and even should! – include addresses of subpages that are not placed in the menu. The XML language allows describing them precisely and marking important data, like the date of the last modification of the subpage.

This method of presenting information about what is happening on your website allows you to quickly inform web bots about creating new subpages or changing existing ones. Giving crawlers a list of important domain addresses significantly speeds up indexing – the program does not have to look for all the elements on its own, it just uses the ready-made solution.

Contents of the Sitemap.xml File 

Sitemap.xml doesn’t have display guidelines (unlike pages created with HTML and CSS), hence its form is a bit unfriendly to the ordinary user.

Let’s take the sitemap of the klodzko.pl website as an example and decode the most important elements of its code:

  • <?xml version="1.0" encoding="UTF-8"?> – the first value specifies the XML version used to create the file, the second the character encoding standard,
  • <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"> – this line opens the tag in which all links contained in the sitemap will be placed,
  • <loc> – the subpage address,
  • <lastmod> – additional information about the date of the last modification of the document located at a given address.

There can be also other information here:

  • <priority> – the content of this tag should be specified in the range 0.0 to 1.0, and its task is to indicate to bots which subpages should be indexed first.
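
Putting these elements together, a minimal sitemap.xml (a sketch with a hypothetical address; each entry is wrapped in its own <url> tag) might look like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2021-05-10</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>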

Sitemap – XML or HTML

Sitemap.xml is not the only “sitemap” that can appear within a website. The task of an HTML map is to show the user the way – e.g. by grouping links to the most important products. This solution is most often used by large websites.

These types of maps are created using HTML and CSS. They do not contain additional information, and links are placed in tags that create lists:
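
For example, a fragment of a simple HTML map (with hypothetical category links) could look like this:

<ul>
  <li><a href="/category/shoes/">Shoes</a></li>
  <li><a href="/category/jackets/">Jackets</a></li>
  <li><a href="/category/accessories/">Accessories</a></li>
</ul>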


Are HTML maps useful? Certainly, they help lost site users get directions. They’re an additional place where links to various categories, tagged with keywords important to us, are located… which can be a nice support for internal linking. Creating a map of this type is worth considering for larger sites, but there’s no such need for smaller ones.

The Benefits of Making a Sitemap for Web Crawlers

The main task of a sitemap file is to show web bots the most important elements on a website. Creating an appropriate file is necessary in several cases:

  1. The website has many subpages – multi-page websites may have problems with fast indexing; it may take several weeks before a robot finds and indexes all subpages. A sitemap indicating the key points will shorten this time and place the domain in search results faster.
  2. The site is new – a fresh domain is usually not very popular among Internet users and doesn’t get many links, which may make it difficult for web crawlers to “discover” it. Adding the site to Google Search Console and then submitting an XML file with a list of all the pages you want to appear in the search engine is the best way to place the site in the search results.
  3. The website has many “hidden” subpages – some websites do not have an extensive menu and, for various reasons, hide content outside the main categories, for example in archives of older posts. If these places also hide valuable content, it is worth inviting web bots in and allowing them to index this part of the website.
  4. The website vanished from search results – sometimes a website is de-indexed and does not appear in Google. These types of mistakes happen even to the biggest players – sometimes all it takes is accidentally changing the robots tag to “noindex” to erase the site from the SERPs. In such situations, the speed of reappearance in search results matters, and a sitemap (especially if there was none before) will allow a quick return to previous positions.

For small sites, a sitemap is not necessary – Google will do fine crawling 5-10 subpages on its own. Larger sites, especially those updated frequently, should take care to prepare the file properly and make it available to bots.

How to Create a Sitemap

Preparing the sitemap.xml file is relatively simple; however, doing it manually may be laborious. Fortunately, most CMSs generate maps automatically or allow you to install a dedicated plug-in.

Note – the most popular URL where sitemaps are placed is www.your-website.com/sitemap.xml, though it is not mandatory. Some automatic audits will flag the lack of a sitemap at this address as an error and report a missing XML file. However, you should not worry about it – the address where the sitemap is placed can be specified in the robots.txt file.
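
For example, a single line in robots.txt is enough to point crawlers to the map (hypothetical address):

Sitemap: https://www.your-website.com/sitemap.xml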

How do you prepare a sitemap on a website without a CMS? The easiest way is to use one of the generators available online. Before placing the file on the site, take the time to review it – the generator, just like web crawlers, may have problems finding pages that are not linked in the menu or elsewhere on the site. If they are not in the file, add them manually.

The size of sitemap.xml matters – the maximum size of a sitemap.xml file is 50 MB, and a single map shouldn’t exceed 50,000 addresses. When a domain consists of many thousands of subpages, it’s worth splitting the file into several smaller ones and creating a map index. Such solutions are used in WordPress plug-ins (see below) and store engines, which automatically generate whole sets of sitemaps. Creating an index with smaller maps lets you bypass the limits and keep a clear file structure.
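
A sitemap index is itself a small XML file listing the individual maps – a minimal sketch with hypothetical file names:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>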

How to Create a WordPress Sitemap

As with most changes within WordPress, also in this case – there’s a plugin for it! Or actually… a whole range of plugins offered by the most popular SEO plugin providers – Yoast and All-in-One SEO.

Both plugins generate several sitemaps, and directories with the files are available at the following addresses:

  • YOAST: /sitemap_index.xml
  • All in One SEO: /sitemap.xml

You can add and remove map components in the administrator’s panel in sections concerning maps and indexation.

Sitemap Do’s and Don’ts

Creating a sitemap and making it available for indexing robots is supposed to show strong elements of the website and encourage their placement in search results. With this in mind, it’s worth taking care to keep your sitemap up-to-date and to include information that leads directly to the desired pages in the index. You should avoid:


  • addresses that return a 404 error,
  • addresses that return a 301 or 302 redirect,
  • non-canonical addresses (addresses that differ from the address contained in the canonical tag).

Adding such addresses may cause problems with proper evaluation by Google. No wonder – being sent on a detour from point A to point B frustrates us too and lowers our trust in the signposts that directed us.

It is also not worth placing in the sitemap addresses blocked for web crawlers (with the <meta name="robots" content="noindex" /> tag or in the robots.txt file).

I have sitemap.xml, what next?

Creating a sitemap.xml file and uploading it to your server is not the end of providing clues for web bots. The map also has its own special place in Google Search Console.

In GSC, go to the “Sitemaps” tab located in the menu on the left. In the tab, add the map address – paste the path itself, without the domain – and click “Submit”.

After a moment, the map address will be placed in the “Submitted sitemaps” table, where you’ll also find detailed information about the map – date of submission, last readout, map status and number of detected addresses.

Clicking on the map address will take you to the next screen. There, click “See Index Coverage” for more details.

This is where you’ll find all the information about the indexing of specific subpages, as well as information about problems Googlebot encountered.

Sitemap.xml – what else is worth remembering?

Creating a sitemap is not the end – you should update it from time to time, especially after making significant changes to the addresses or structure of the site, such as:

  • implementing an SSL certificate and switching to https addresses, which forces changes in all addresses on the site,
  • redirecting or removing individual addresses – it is enough to update their URLs in, or remove them from, the sitemap.xml file,
  • a complete redesign of the website.

Updating sitemap.xml is also one of the actions worth taking after keyword optimization.

Sitemap.xml may not be as colourful as its classic cousins, but it’s definitely worth using!


SEO glossary – over 300 positioning-related terms


SEO glossary – introduction

Welcome to our glossary of over 300 terms on website optimization and positioning. It’s probably the largest set of short definitions used by SEO specialists and agencies to navigate the world of Search Engine Optimization.

SEO glossary – A

Above the Fold (ATF)

Content the user can see without scrolling. The user decides whether to scroll down, click next, or return to the search results. Content presented here should motivate the visitor to stay on the page longer.

Some proven tips on above the fold content:

  • communicate exactly what you offer and what to expect after scrolling down,
  • provide important content and messages for the visitor,
  • landing pages should include a call to action (CTA).

Absolute URL 

An address used for internal linking and designating the absolute path to a file, HTML page, image, or another form of content that you want to link to.

The link is used as part of the anchor tag and consists of a protocol, domain name, possibly a subdirectory, and the name of a file, document, HTML page, image, etc.

In HTML, the absolute URL is enclosed in quotation marks, for example:

<a href="https://www.verseoads.com/blog/">Verseo blog</a>

A/B Testing

Tests of website or campaign settings that compare two variants and, based on user behavior, determine which works better.

Accelerated Mobile Pages (AMP)

Websites adjusted to mobile requirements. AMP trends should be followed by companies that want to speed up loading time on mobile devices.

AdSense, Google AdSense

Google advertising program that allows displaying text ads, banner ads, or video ads on websites.

AI – Artificial Intelligence

A system that simulates human intelligence when performing tasks. It improves itself based on processed information, often providing more optimal choices or results than a human. Systems based on AI or machine learning can make many changes in a second and work 24 hours a day. This allows for better optimization of advertising campaigns or improvement of the data quality provided by SEO tools.

AJAX – Asynchronous JavaScript and XML

Programming that allows a website to send and receive information from a server to dynamically change the page without reloading.

Algorithmic Filter

A penalty mechanism applied automatically by the search engine’s algorithm to a website’s score. This filter is relatively difficult to detect: check the website statistics for a sharp decrease in the number of visits and in the number of phrases ranking in the TOP10 search results.

Algorithmic Penalty

Affects website ranking. It’s relatively hard to spot without the help of an SEO specialist. To resolve an algorithmic penalty check Google’s guidelines.

Anchor Text

A linked word or words that point to a specific page to provide people and search engines with information about the linked web page. Anchors are placed between the <a> tags in HTML code.

Answer Box

A box at the top of the search results where the search engine displays a direct answer to the question or phrase entered by the user (see also: Featured Snippet).

Alt Attribute

HTML code that provides information to search engines and screen readers (for the blind and visually impaired) to understand the content of the image. 

The alt attribute is the HTML attribute used in HTML documents to specify the alternative text. Adding image descriptions is a good practice that allows search engines to “see” what is on the image. By using keywords in the alt attribute you can optimize your images for Google Images.
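
A minimal example (hypothetical file name and description):

<img src="red-running-shoes.jpg" alt="Red running shoes on a wooden floor" />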

Authority Site

A website trusted by users and industry experts. They publish reliable information and point to reliable sites. The content posted here is of high quality and contains up-to-date information.

SEO glossary – B

Backlink

A link from another site that points to our site. One of the activities performed by an SEO specialist is getting backlinks from good sources. Acquiring backlinks is a sign for Google bots that the site, to which such links point, is a valuable source of information. The SEO effect works both ways – when someone posts a link pointing to your site, or when your site links to different websites.

The number of backlinks is one of the most important factors determining a website’s position in the ranking. Equally important are the link sources and the overall link profile.

Backlink Pages

Websites created for ranking a given website. Content focuses on topics that are related to those discussed on the positioned page. Thanks to creating backlink pages we obtain reliable links leading to our website, which are perceived by web crawlers as positive and unobtrusive.

Baidu

The most popular search engine in China, with an over 60% share of the Chinese market and around 1.37% in the world market. The Baidu search engine offers almost the same services as Google.

Bait and Switch

This SEO phrase refers to a black hat SEO technique that uses specific content on a webpage to rank high in SERPs. After getting a satisfactory position, the content of a given website is swapped (usually for content that would not be able to get such a good position).

Ban

The most severe of Google’s penalties, imposed on a website that violates the guidelines for webmasters. The domain is removed from Google, and the number of indexed subpages drops to 0.

Banner Blindness

People’s tendency to ignore elements on websites that resemble advertising content or are advertisements. This applies to an increasing number of internet users. 

Basic Keyword

A keyword with the greatest potential to attract traffic to a website, so it’s really important. When creating website content, it’s worth optimizing it for a given keyword. It should be used in the page title, content header, and first paragraph.

Bing

Search engine owned by Microsoft, with a global market share of 3.18%. Implemented in Microsoft Edge. Read more: search engines alternative to Google.

Black Hat SEO

Practices that go against search engine guidelines, used to improve the ranking of pages by manipulating the search engine while ignoring the human audience and their experience. Black hat SEO activities include keyword stuffing, cloaking, paid links, over-optimization, posting hidden text, and other changes that conflict with the guidelines.

Blog

A blog is a type of website in which content is organized into categories so that readers can quickly find what they’re looking for. A business blog is a great source of traffic to your website. If you are investing in SEO, a blog will probably be one of the solutions suggested by an SEO specialist. The search engine has changed the rules of the game: several years ago we asked our friends for recommendations, today we look for answers on blogs.

Blog Commenting

Blog commenting was once a very popular link-building tactic in which users leave a comment containing a link to a website related to the content of the article.

Currently, this tactic is the focus of debate, because it was overused: people were spamming blog articles, leaving links and irrelevant comments everywhere. Comment sections now add nofollow attributes to links. Google also put an end to this practice by penalizing links from irrelevant websites and reducing the value passed by pages with huge numbers of links. Even so, it’s still worth commenting – leaving a valuable comment on a blog article can help build relationships.

Bounce Rate 

An indicator of how many users visited your site and then immediately exited the site without taking any action or browsing any other pages.

The lower your bounce rate, the better. A high bounce rate (over 70%) is a sign that something on the website is not working properly, e.g. the page loads too slowly, the landing page is unresponsive, the content is of poor quality, there are too many ads, or users land on the page after typing a keyword it doesn’t actually answer. It’s worth talking to an SEO specialist about how to fix the problem.

More about bounce rate: Bounce rate – What is it?

Branded Keywords 

Keywords and phrases that contain the name of the company or brand and its variations. SEO specialists usually do not advise ranking for phrases including brand name (e.g. Nike shoes, Apple computer), because these are extremely competitive phrases and the budget should be allocated to best converting queries.

Breadcrumbs

A type of additional navigation that helps users track where they are on a website. A breadcrumb trail usually appears at the top of a page and shows the path from the current page back to the homepage.

Example: home > rooms > bedroom > beds

This is beneficial for both user convenience and indexing as it provides an improved internal link structure for large sites.

Broken Link

A link that doesn’t work – due to a wrong URL or a linked page that has been removed. A broken link often appears when a page has been moved or renamed. This can have a negative impact on SEO and lower your website’s ranking in search engines by stopping bots from crawling your site.

SEO glossary – C

Cache

A storage location where data is saved so that it can be accessed as quickly as possible. Thanks to the cache, many elements of our website can load much faster, without the need to connect to the server to download data.

Call-to-Action (CTA)

CTA usually refers to a button designed to prompt an immediate response. When designing advertising banners or landing pages, it’s worth placing a button that will convince visitors to act in a specific way, e.g. Buy now, Register, Write now, Call, Check.

More about CTAs: Call to Action – What is it?

Canonical URL

If you happen to have duplicate content on your website, web crawlers will not know which one to show in search results. Canonical URL allows you to specify which page variant is the canonical one: the variant that you want to show in search engines.

Enter a canonical tag (rel="canonical") in a snippet of code that looks, for example, like this: <link rel="canonical" href="https://www.verseoads.com" /> on each duplicate page. This is the job of an SEO specialist who, when analyzing the website, will certainly take into account the issue of duplicates.

Clickbait

Content (in various forms) that appeals to your emotions and curiosity and encourages you to click. Using clickbait can increase traffic to your website. Nevertheless, it’s worth considering the content and form, so that it’s consistent with the brand’s communication.

Cloaking

A black hat SEO practice in which the Google crawler is presented with different content than the user visiting the website.

Competitor Analysis

Assessing competitors who are trying to rank for the same or similar keywords as your website. Competitor analysis helps you identify the tactics that work best for your business and the areas that should be prioritized.

Competitor analysis in SEO will allow you to:

  • discover keywords that you can also use,
  • discover new opportunities for backlinks,
  • get additional ideas for content,
  • get ideas for optimizing titles and content descriptions.

Content Delivery Network (CDN)

A large, distributed system of servers located in multiple data centers and at Internet traffic exchange points, which delivers content to users from the location closest to them.

Content Marketing

Content Marketing is a process and a long-term strategy that assumes the regular delivery of attractive, valuable, and closely tailored content. Content is used to build your audience, that is, a group of recipients who will regularly return to the source for new information. The primary goal of content marketing is sales, but all content tactics are considered inbound marketing, which – according to HubSpot – means activities that enable recipients to independently find the sender of a given advertising message.

Content Management System (CMS) 

CMS allows users to edit, update and maintain content on the site without requiring any developer support. CMSs are easy to learn and configure. One of the most popular CMS examples is WordPress.

In terms of SEO, CMS systems have certain limitations: although they make it much easier to enter and modify the content on the website, it’s much more difficult to introduce changes in the code and page settings that would affect its position in the search engine. Nevertheless, there are currently many SEO plugins that help optimize content, e.g. on WordPress.

Content Pruning

The process of analyzing and assessing the content on the website and removing pages that are of little value to your audience and the website. This doesn’t always mean deleting it but simply preventing bots from indexing it.

Content Spinning

SEO technique of rewriting the same article using synonyms or changing the order of words in a sentence. This creates other articles on the same topic. It’s done in the hope that Google won’t mark them as duplicates. So it’s a risky SEO practice.

Content Syndication

Also known as article syndication, syndicated content;

A method of republishing the same content on different websites, so that it can be read by new audiences. This will increase traffic and build links.

The risk, however, is that a different site may rank better in search results than your site (which has first posted the article) if it has more authority. So it’s worth considering all the pros and cons before publishing.

Conversion 

A user performing the expected action. For example, signing up for a webinar, downloading an ebook, clicking a link in an email, or interacting with any other CTA.

Conversion Rate

The percentage of users who completed the desired goal (conversion) out of all users who saw your ad, visited your website, received an email, or saw a call to action.

How do I calculate my conversion rate?

It’s enough to divide the number of conversions by the number of clicks on the ad (or other interactions we wanted to measure).
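
For example (hypothetical numbers): 50 conversions from 1,000 ad clicks gives 50 / 1,000 = 5%.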

Copywriting

Writing texts for publication, websites’ descriptions, and advertising slogans. The copywriter provides texts for the needs of e.g. TV and radio spots, internet creatives, descriptions for websites, advertising brochures, leaflets, and blog entries, etc.

Core Web Vitals

Core Web Vitals can be described as indicators that are designed to gauge the user experience when it comes to page loading, visual stability, and availability during and after rendering.

There are three Core Web Vitals:

  • Largest Contentful Paint – LCP
  • First Input Delay – FID
  • Cumulative Layout Shift – CLS

More about CWV: Core Web Vitals – What Are They and Why Are They Important?

Crawling

A process performed by web crawlers during which the website’s content and code are analyzed. During crawling, bots also follow all internal and external links of our website.

Crawlability

The ease with which crawlers can navigate our site, understand it, effectively search its content, and add pages to the index.

Crawlability can be improved by, for example, creating an XML sitemap, optimizing page speed, images, and videos, and using redirects properly.

Cross-linking

The process of combining several domains belonging to the same company or person. It’s not a problem if the websites are thematically related and do not have many links.

CSS – Cascading Style Sheets

The language that defines the appearance of an HTML document. Each page has one or more CSS files, where each element has a specific size, color, layout, etc.

CTR – Click Through Rate

The percentage of people who clicked on your ad or website after it was displayed in the search results.

How to calculate it?

Just divide the number of clicks by the number of impressions and there you have your CTR!
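
For example (hypothetical numbers): 30 clicks from 1,000 impressions gives 30 / 1,000 = 3% CTR.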

SEO glossary – D

Data Highlighter

A tool available in Google Search Console used to teach the search engine which information on the website should be treated as structured data so it can be presented more attractively in search engine results.

Dead-End Page

A page without any outgoing links – a dead end for visitors and web bots, giving them no choice but to leave the site.

This can be easily fixed by placing navigation links in the header, footer, navigation menu, or links suggesting users go to the next step – next page.

De-indexing 

The process of removing a website from the search engine index, preventing it from appearing in search results.

To do so, add a “noindex” meta tag to the page.
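
The tag, placed in the page’s <head> section, looks like this:

<meta name="robots" content="noindex" />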

Deep Link

Link on an external page that leads to a subpage rather than to the website’s home page.

Depositioning

Activities aimed at lowering the position in the search results of the selected domain or individual subpages. It can be negative (unethical) or positive (getting rid of outdated information about a company that managed to rank high in search results and does not allow current information to rank higher).

Directory Submission

A procedure used by SEO specialists to add a website to different directories under a particular category to get backlinks. It’s important to use unique page descriptions and titles that most often become the anchor of the link. Of course, the website should be added to directories that are thematically related and correspond to the content of the website.

Direct Links

Links that point directly to the target website and may affect its position in search engine results (if specified with the dofollow attribute).

Direct Traffic

The number of visits to your site that began with the visitor typing the web address (URL) directly into the browser.

Disavow Backlinks

The process of asking Google, via its disavow tool, to ignore backlinks that are not beneficial to your website, for example from unrelated and untrusted sources. Such links are considered bad and may result in a penalty or have a negative impact on the ranking of the website.

Dofollow Links

All links are dofollow by default, which means they pass traffic and value to the linked page and allow crawlers to follow them, helping the linked page rank in the SERP.

Domain

The name identifying a website’s location, usually displayed as a business name rather than an IP address. It’s the element of the internet address that directs internet users to the desired page.

Domain Age

One of the factors taken into account by Google when evaluating a website. The age of the domain is counted from the moment it’s registered, and the longer it’s in service, the more valuable it’s from Google’s point of view.

Doorway Page

Also known as a jump page; a black hat SEO method which refers to creating a website and ranking it for selected key phrases, using techniques that do not comply with the guidelines for Google webmasters. The user visiting the Doorway page is automatically redirected to the customer’s landing page.

DuckDuckGo

An alternative to the more popular search engines; a search engine that emphasizes user privacy and does not use personalized search results. The users are not profiled, therefore all users get the same results for a specific search query.

More about DuckDuckGo: read an article about alternative search engines.

Duplicate Content

Content that appears on more than one website (on the same domain or different pages). When the search engine crawler finds duplicate content, it will not be able to determine which page is more important. Duplicate content is a common issue if we have both a mobile version of our website and a desktop version. In this case, we have to select a page that we will identify as original and add a code snippet (rel = canonical) to all others to tell search engines which page to treat as the original.

Dwell Time

Time visitors spend on your website after clicking its link in search results. The longer the better, as it indicates that the visitor has found relevant and engaging content.

SEO glossary – E

Email Outreach

Also known as link outreach, blogger outreach

An important part of the linking process. It refers to the possibility of acquiring backlinks by contacting the author of the page to obtain a link leading to your page. Through email outreach, we can obtain links from:

  • pages with a link in an article that corresponds to the topic of one of your web pages,
  • websites that are relevant to your niche, with a small number of outbound links and the possibility of including you in the content as e.g. a co-author,
  • websites for which you can write a guest post,
  • businesses or people that you mentioned in your content,
  • businesses and people that mentioned you on their website.

Error 200

HTTP code which means that the query sent to the server was successful and the user was presented with the content of the requested document.

Error 202

One of the HTTP response codes; it informs the user that the request has been accepted by the server for processing, but processing has not yet been completed.

Error 404

Appears when the browser was able to communicate with a given server, but the server could not find the subpage the user was looking for.

Error 503

An error message indicating that the server is overloaded or temporarily unavailable. It usually disappears when the page is refreshed; if not, it probably means the page exceeded the load limit set by the hosting company, e.g. the server’s CPU or memory limits.

Exact Match Anchor Text

Occurs when the anchor text accurately describes the subject of the webpage it leads to.

Evergreen Content

Content that is always up to date – topics that remain relevant to readers regardless of the date or season. Read more about evergreen content in an article: Evergreen Content – The Holy Grail of Content Marketing Strategy

Exact Match Keywords

Google keyword match type. Your ad will appear in search results only if the user enters the word or phrase in the given order.

Exact Match Domain (EMD)

A keyword entered in the search engine that exactly matches the domain name, e.g. “room wallpapers”, www.roomwallpapers.com

External Duplicate Content

Duplicate content created on different external domains, e.g. by copying:

  • descriptions of products from their manufacturers,
  • text from competing sites,
  • own content for internet catalogs,
  • content from your website for publication purposes, e.g. on social media or other platforms.

External linking

Linking from external websites to our website. It aims to obtain links from subpages of websites of high quality and authority.

SEO glossary – F

Favicon

A small icon, most often in the form of a logo (16×16 pixels), that represents your brand, company, or website. It’s displayed in browser tabs, search history, and bookmark lists. A favicon has no direct impact on SEO but improves usability, which in turn increases user engagement metrics (such as page views, returning users, conversion rate, etc.), which are ranking factors.
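
For reference, a favicon is typically declared in the page’s <head> section like this (hypothetical path):

<link rel="icon" href="/favicon.ico" />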

Featured Snippet

Also known as a direct answer, answer box

After entering the search query, the first answer in the results may be presented in a box. It’s a quick answer or summary with a content snippet from a relevant website.

By using structured data, we can help Google understand our content better, and this can help us rank our content as a featured snippet. Google’s algorithm determines which site best answers the question and displays this site.

Filter

A method of imposing ranking penalties used by Google against websites that apply unethical positioning techniques to manipulate search results. As a result of applying a filter, the website’s position for selected keyphrases drops, or the site is completely removed from search results.

There are two types of filters imposed by Google:

  • manual filters
  • algorithmic filters.

Footer Link

A link at the bottom of the page. These links may lead to specific categories on the website and external websites.

SEO glossary – G

Geolocation

Presenting local results for a query based on the user’s location, determined from their IP address. Typing in “vegetarian restaurant” while in London will return a list of restaurants in London.

Geotargeting 

Method of targeting your ad to users based on their location.

Google

The most popular search engine in the world with a market share of 90.91% and loads of additional services and tools.

From the SEO point of view, we can use the following tools:

  • Google Keyword Planner
  • Google Alerts
  • Google Analytics
  • Google Maps
  • Google Mobile Optimization Test
  • Google My Business
  • Google PageSpeed Insights
  • Google Search Console
  • Google Trends
  • Google Webmaster Tools (Google Search Console)
  • Youtube

Googlebot

A web crawler program whose task is to automatically find, analyze, and index websites on the Internet (both on the computer and in the mobile version). It detects both new and updated websites by following the links on the websites and thus reaching other new websites.

Google Ads

An advertising system created by Google for building and managing advertising campaigns. Ads created in it lead users to our website, encouraging them to buy a product or use a service.

Includes various types of ads, such as e.g.:

  • text ads,
  • Google Display Network ads,
  • Google Shopping ads,
  • Youtube ads.

Google Ads and website positioning are two mutually supporting channels; using them together gives the best results and increases business visibility in search results, and thus sales.

Google Algorithm 

Programs used by Google to rank websites presented in search queries. These are subject to frequent updates. The most important updates released over the past few years are: Panda, Penguin, Pirate, Hummingbird, Pigeon, Mobile Friendly, RankBrain, Possum, and Fred.

Google Analytics

Google’s web analytics tool provides precise data about a website. After adding a tracking code to your website, you can track and get detailed data about the traffic on your website.

Google Autocomplete 

A search function that suggests and completes the typed phrase when the user enters it into the search field.

Google Knowledge Panel

The panel contains basic information about the user’s query. Most often it appears to the right of the search results.

Google Mobile-Friendly Test 

A set of test procedures aimed at verifying a website’s performance on mobile devices. In recent years, Google has been paying more and more attention to mobile-friendly websites as there are many mobile phone users. Google is setting high standards when it comes to user experience and mobile-friendliness has become an important ranking factor.

Google PageSpeed Insights

A Google tool designed to analyze the content of the website to determine its speed. It contains tips on what should be improved to speed up the website’s loading time.

Google Search Console

A free Google tool that allows you to check the status of the page in search results, analyze incoming and outgoing links. It also contains information about keyphrases and indicates page indexing errors.

Grey Hat SEO

SEO practices that are technically allowed but ethically questionable. One of these practices is buying or selling links. While almost impossible to detect when purchased in your niche, it could result in a penalty. The same goes for buying “likes” and “shares” on social media.

Guest Blogging


Posting content on blogs that are both attractive to readers and effective in terms of SEO. Such entries posted on high-quality thematic blogs strengthen the position of a given website in search results.

SEO glossary – H

H1 – H6 – HTML Header Tags

Header tags are used in HTML to create structure. They mark titles and subtitles within the text. There are six levels of headings in HTML – H1, H2, H3, H4, H5, and H6 – and the lower the number, the more important the heading. It’s the easiest way to tell both readers and web bots what the content hierarchy is. H1 defines the most important heading on the page, and there should be only one such tag per page.
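
A minimal sketch of such a hierarchy (hypothetical titles):

<h1>SEO Glossary</h1>
<h2>Above the Fold</h2>
<h2>Anchor Text</h2>
<h3>Exact Match Anchor Text</h3>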

Hidden Text 

Content that is in the code but is not visible to the user. In the past, this method was used to manipulate rankings, mainly by keyword stuffing, which is why it’s classified as a black hat SEO technique.

Examples:

  • text that matches the color of the background,
  • text behind the image or off-screen, placed with CSS,
  • use of CSS to reduce the size of the element that contains the text and hide the overflow.

Hyperlink

Text or an image that, when clicked, gives direct access to different pages or a specific element within the page. It enables easy web navigation.

Hreflang

The Hreflang attribute is used to mark pages on a multilingual site. By using this attribute, you can be sure that your visitors see the content on your site in the language they used for their query. Hreflang also comes in handy in terms of duplicate content – for example, a site that has the same content for different locations (e.g. Polish, German, or English versions), with differences such as currency. This way you tell Google which localization or language a given version of the website is optimized for.
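
A minimal sketch for a bilingual site (hypothetical addresses), placed in the <head> section of both versions:

<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="pl" href="https://www.example.com/pl/" />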

Homepage 

The main page of a website is displayed when a visitor enters a web address that contains only the domain name. It’s also the most used page. The lower-level pages are called subpages.

HTTP – Hypertext Transfer Protocol

The main network protocol that allows users to display and receive information. From the SEO point of view, it’s better to use secure HTTPS, especially for online stores, where HTTPS is required.

HTTPS – Secure Hypertext Transfer Protocol

An encrypted, secure version of the HTTP protocol that keeps information safe from hackers. Google recommends this HTTPS protocol, offering some benefits to web pages using it, such as e.g. better rankings and data on referring traffic.

Read more about HTTPS in an article: TLS/SSL Certificate – What is it?

SEO glossary – I

Image SEO 

Optimization of all images on a website to increase content relevance, search engine visibility, and accessibility to screen readers.

You can position images by:

  • choosing the right file size so as not to slow down the loading time,
  • using descriptive file names,
  • adding alt tags with picture descriptions,
  • creating image sitemaps.

Image Sitemap

An XML file that contains all the images on your site that you want to index. This can be a separate file or added to an existing XML sitemap.

Indirect Links

Links that lead to the target website indirectly, through a 301 or 302 redirect – the user and web crawlers are first led through an intermediate domain before reaching the destination page.

Inbound Links

Links that point to our website. These links were obtained naturally, e.g. a different website recommended our product or service.

Inbound Marketing

According to Hubspot, these are all activities that enable the recipients to independently find the sender of a given advertising message. Inbound marketing builds awareness, attracts new clients with channels like blogs and social media.

Index

A search engine database that contains data about all websites that a user has access to. If a website is not included in the index, it will not be shown in the search results. To index means to add a website to the search engine database.

If you want to speed up this process, you can:

  • check for the “noindex” value in the robots HTML tag,
  • submit an XML sitemap that contains the missing page,
  • get backlinks from sites with high authority because they are more frequently indexed.

Indexing

Indexing is done by network bots. The process involves collecting information about the content, key phrases, links, and images posted on your website and saving it in the search engine’s database. This way, each time users use a search engine, they search that database instead of looking through all the pages on the Internet.

Internal Duplicate Content

Duplicating content within one domain, e.g. presenting many products with the same or very similar description.

Internal Linking

Links to other parts of the text or other subpages within one domain.

International SEO

International SEO aims to increase the visibility of your website in different countries.

After setting up language and country detection on the website, correct content will be displayed automatically. Examples of international SEO tactics:

  • optimization for the most used search engine in the selected country,
  • hreflang, which will determine the language of the website’s content and help reach the target group,
  • translation of the HTML elements of the website, e.g. URL, meta description, and title,
  • translation of website content, including navigation, image names, and tagging,
  • use of country detection to match the currency.

Interstitial

A type of intrusive advertising e.g. pop-ups, banners, or overlays, which cover most of the content and make it difficult to use the website.

SEO glossary – J

JavaScript rendering

An option that allows search engines to search and understand content on a website that is often dynamically generated using JavaScript code.

The content of JavaScript on a page can vary from simple style changes to deeper integrations that modify the structure of the page with specific data.

Google is one of the few search engines that supports JavaScript rendering, because rendering JavaScript content on a page is a difficult task. The script must first be downloaded and executed, which can take a lot of time and resources for a bot. Additionally, it often doesn’t work properly, because JavaScript-generated content doesn’t behave like a standard, static HTML page.

JSON-LD

A structured data format, implemented dynamically in the form of a JavaScript script block or embedded as widgets in the content management system.
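
A minimal sketch of JSON-LD structured data (hypothetical company details) embedded in the page code:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com"
}
</script>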

SEO glossary – K

Keep It Simple, Stupid (KISS)

The famous principle, especially popular in the business world, states that it’s best to describe various issues in the simplest and most user-friendly way possible, so as not to complicate the message.

Key Performance Indicators

Indicators that allow you to measure whether the goals set at the beginning have been met. These indicators may relate to business as well as marketing and financial issues. In the case of SEO, our KPI can be, for example, the number of visitors to the website or the bounce rate.

Keyword

A search query entered by users in search engines to find interesting things/ products/ services. Knowing keywords for your offer will allow you to include them in the content on the website, and thus gain a better position in the search engine.

Keyword Analysis 

The process of finding all the keywords a website ranks or can rank for. When conducting keyword analysis, it’s worth considering the number of searches, CPC, competition, and current position in the SERP.

Keyword Cannibalization

Occurs when several subpages of one website appear in the search results for a given phrase or keyword. It can be problematic for the website owner when a less significant subpage obtains a higher position than a more important (from the business point of view) subpage. For example, when the home page ranks higher than a specific product page.

Keyword Categorization

The process of classifying and grouping keywords based on the user’s intentions at the time of the search. Keyword categorization can help with determining keywords that users search for at different stages of the shopping path.

Keyword Competition

Indicates the popularity of a given keyword among web pages that rank for it. Keyword competition is an important indicator of how easy or difficult it will be to rank for a particular keyword in the search results.

Keyword Density

The percentage of times a key phrase appears on a given website. Too many keywords in the text may cause the website to be considered spam and lower its position in the SERPs. The density and number of keywords in a given text should be determined depending on the topic of the text and the industry in which we operate.

Keyword Frequency

The total number of times a keyword is used on the website. We measure only the raw count of occurrences; unlike keyword density, the amount of content on the website is irrelevant in this case.

Keyword Funnel

Refers to using different keyword searches by the customers depending on where they are in their purchase funnel.

A customer who wants to buy a given item will enter different terms than the one who just wants to learn about the products. We can categorize given keywords according to these stages and optimize the specific landing pages to increase conversion.

Keyword Positioning

Distribution of keywords or phrases in the text. According to good practices, we should include them in the title, meta tags, headers, as well as in the entire content of the page, taking into account the density of their use and position.

Keyword Prominence

An indicator of how important or “noticeable” a given keyword is. Of course, the most important place will be at the beginning or near the beginning, e.g. in titles, headings, meta descriptions, or the initial fragment of the text.

Keyword Proximity

Refers to the distance between keywords within a body of text. For example, when you have a longer phrase, such as the Content Marketing Guide, it takes into account how often the phrase appears in the same form. If the word order matches this phrase, the website is perceived as more relevant.

Keyword Rank

Position of your website in the search results for a given keyword.

Keyword Research

The process of properly matching key phrases to the website. It’s crucial when carrying out positioning, also when preparing content friendly not only to users but also to search engines. Keyword research can be carried out manually or using dedicated tools, such as Google Ads Keyword Planner.

Keyword Stuffing

The practice of adding an excessively large number of keywords in the text in the hope that it will guarantee a better ranking in the search results. This is a practice not recommended by SEO specialists, because it usually results in a decrease in the credibility of the website for web bots, and thus in a drop in the position.

SEO glossary – L

Landing page

The page where the user who clicked on a link is directed to. Most often a landing page is a subpage created for a specific purpose, e.g. subscribing to a webinar, downloading an e-book, or purchasing a product. However, it can be any page that we decide to direct traffic to.

Latent Semantic Indexing (LSI)

LSI keywords are words and phrases that are semantically related to the webpage’s target keyword. These words have the same context and are often used together. An example of an LSI keyword for “SEO” can be an SEO audit, SEO tools, SEO phrases, etc.

Lead Magnet

Content intended to encourage potential customers to leave an email address or other contact details. An example would be offering free copies or valuable content in exchange for contact details, e.g. email. The email lists collected in this way can then be used in further marketing activities.

Link Baiting

A method of getting quality links without paying for them. It’s often used in content marketing, where valuable material is created so that Internet users will want to link to it.

Link Building

A series of activities aimed at providing our website with valuable inbound links. When it comes to link building it’s advisable to contact an SEO specialist, because the method of linking and the selection of pages from which we want to obtain links is extremely important in the entire process of website positioning and can affect a page’s position.

Read more about link building in an article: Building Backlinks.

Link Burst

Occurs when a website gets a large number of backlinks in a very short time. This is not desirable because it’s usually a sign for web crawlers that the links were not naturally obtained.

Link Diversity

A link building strategy where we collect several links from different types of websites (blogs, news, directories, social media) and different domains (.com, .eu) using the attributes “nofollow” and “dofollow”. Many experts believe that it’s important from the SEO point of view to diversify the sources of links and thus create their natural profile.

Link Equity

The term describes the process of passing authority from one page to another through the use of hyperlinks.

Link Exchange System

A system that allows you to automatically obtain a large number of links leading to a given website in a very short time, using the websites of publishers belonging to a given system. The use of these systems is not recommended and is poorly received by Google, so it may be a potential factor in reducing the value of a website in search results.

Link Farm 

A collection of websites that link to each other or to one specific page. According to Google’s guidelines, this practice is inadvisable, and a penalty may be imposed.

Link Hoarding

A tactic used by some websites trying to collect as many inbound links as possible while keeping the number of outbound links as low as possible. It’s an unacceptable practice.

Link Popularity

The total number of backlinks pointing to your site. Each backlink is counted separately, even if it comes from the same site.

Link Profile 

The makeup of all links leading to your website. We can analyze the link profile based on:

  • sources of obtaining links (forums, blogs, catalogs, other websites),
  • types of link anchors,
  • link formats (text, graphic, etc.),
  • link attributes (dofollow, nofollow, etc.).

Link Reclamation

The process of recovering lost links.

Link Relevancy

Determines whether the content published on the website from which we obtain the link is consistent with the content of the website to which the link leads.

Link Rot

Links that no longer work – they are broken, or the pages they point to have been removed. This can happen if the website owner does not conduct regular audits and does not check the link profile from time to time.

Link Scheme

All activities aimed at obtaining incoming links that do not comply with Google’s guidelines, such as buying and selling links, exchanging them on a large scale, or too many sponsored articles on the website.

Link Spam (blog spam, comment spam)

Posting out-of-context links in the comments on various blogs or internet forums to get as many links as possible. Currently, most links are added this way with the nofollow attribute, so it’s not a profitable practice.

Link Title

The link title attribute gives additional information about the link. However, this title has no direct relevance to SEO.

Link Velocity

The speed of link growth to a webpage. It’s important in terms of SEO how fast or slow you develop your backlink profile. The more natural it seems to be obtained, the better.

Linkbait

Valuable content that other websites will naturally lead to, containing information that is unique and high-quality gives viral spread potential. Difficult to achieve, but extremely beneficial for SEO.

Local Search 

A search method that shows companies that are closest to the location of the user entering the search query (usually brick-and-mortar businesses or service providers) in the search results.

Local SEO

Activities aimed at ranking the site for search queries related to the location, for example, after entering “waffles in London” in the results you will see restaurants and cafes offering waffles in this city.

Local Traffic

Users who found our website as a result of positioning activities based on local phrases – most often focused on the same city or province. In this case, geo-targeted key phrases and Google Maps listings can come in handy.

Long Click

Occurs when the users, after entering the phrase in the search engine and finding the appropriate link, go to the page and no longer return to the search results, because they have probably found what they were looking for.

Long Tail

A phrase of a few words (min. 3-4) that indicates specific (not general) features of the query. It’s, for example, the name of a specific product or model with parameters or exact features. The so-called long tails provide more information about the needs of your potential customer and the desired features of the product or service. Moreover, such phrases often have a much higher conversion rate than more general phrases. Most often because, by entering such a specific name of the searched product, the user has already heard about it, was interested in it, and now wants to buy it.

Read more about long tail keywords: Long Tail Keywords – What Are They?

SEO glossary – M

M-dot Domain

A version of the website dedicated specifically to mobile users. The name of the domain starts with the letter m before the dot and the entire website address (m.example.com).

Machine Learning

The ability of a system to learn certain actions based on a huge amount of data. The system learns the given schemes and solutions and as a result, it can improve the operation of a given service or product.

The proprietary VCM system (Verseo Campaign Manager) can adjust the rates in Google Ads based on a huge amount of data from the campaign; the system can optimize them in such a way that it brings the most profitable results by monitoring campaigns 24/7.

Manual Filter

Type of filter imposed by Google. Fortunately, we are notified about its imposition. Therefore, we can easily check in Google Webmaster Tools (Google Search Console) whether such a filter has been applied. If so, we will also find information on whether the filter was imposed on all or only part of the site and the reason for the penalty.

Meta Description

A description of the page content that can be displayed in Google search results. It helps users assess whether the website responds to their query, and a well-written description encourages them to click.

Meta Tags

Tags placed in the head section of the page that inform the web crawlers about the content of the site. They are used in HTML and XHTML. The most important attributes are the title, description, and robots.
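
A minimal sketch of a head section with the most important tags (the values are placeholders):

<head>
  <title>Example Page Title</title>
  <meta name="description" content="A short summary of what the page is about.">
  <meta name="robots" content="index, follow">
</head>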

Meta Keywords

Keywords describing the content of the website used to inform web crawlers about the website’s content. This was frequently overused, so they are currently not considered a ranking factor by Google. Currently, it’s believed that their impact on SEO may even be negative.

Meta Refresh

A method of informing the browser to refresh the page after a certain amount of time. It’s used primarily when the website contains data that may change, e.g. depending on availability or settings – for example, the number of products in a store.
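
For instance, a tag along these lines asks the browser to reload the page every 30 seconds (the interval is just an illustration):

<meta http-equiv="refresh" content="30">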

Microdata

A type of structured data that allows search engines to better understand the content of websites. With their help, you can mark the different types of content that appear on the page, so that search engine bots can better define it.
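
A minimal sketch of microdata markup using a schema.org type (the data itself is hypothetical):

<div itemscope itemtype="https://schema.org/Person">
  <span itemprop="name">Jan Kowalski</span>
  <span itemprop="jobTitle">SEO Specialist</span>
</div>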

Mirror Site

A website that is copied to a different www address. It’s used when the main website generates so much traffic that the server cannot handle it.

Mobile First Index

An index created by Google under which websites are assessed primarily on the basis of their mobile version, i.e. whether they are optimized for mobile devices. It’s also a ranking factor for Google, so you should consider creating a mobile version of your current site.

Mobile-friendly Website

A mobile-friendly website is optimized for access via mobile devices.

Mobile Optimization 

Any activities aimed at improving the website for mobile devices. Very often it means a significant change of the website to fit the requirements and criteria allowing the users of small screens to freely navigate the website. Fast loading speed and responsiveness always work 🙂 These are the most important features of a mobile website.

SEO glossary – N

NAP – Name, Address, Phone Number

Important contact details (name, address, and phone number) that should be on the website. It’s most often recommended to put this data in the “Contact” or “About us” tab or in the footer of the site, as it increases credibility in the eyes of users and Google bots.

Natural Links

Links obtained naturally, that is without the help of an SEO specialist, e.g. by recommending our website (or part of the content posted) on a different website.

Negative SEO

Negative practices that are aimed at lowering your competitors’ position in search results. These may include: hacking a competitor’s website, spamming, linking from unreliable sites, or creating duplicate content.

Noarchive

A value that can be added to the robots meta tag if we do not want search engines to store a cached copy of the page the next time it’s indexed.

Nofollow

A value (used in the robots meta tag or in a link’s rel attribute) which indicates that web crawlers should not follow the links on a page or a specific link.
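
For a single link, the attribute might look like this (the address is a placeholder):

<a href="https://www.example.com/" rel="nofollow">Example link</a>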

Noindex

A tag responsible for blocking the search indexing of the selected subpage. While Google crawlers can still check its content, it will not be indexed and thus will not show up in search results.

Noopener and Noreferrer

Values of a link’s rel attribute that describe the relationship between the page containing the link and the page the link points to: noopener prevents the newly opened page from gaining access to the original tab, while noreferrer additionally withholds the referrer information.
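
They are typically added to links that open in a new tab, for example (the address is a placeholder):

<a href="https://www.example.com/" target="_blank" rel="noopener noreferrer">Example link</a>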

Not Provided

Refers to hiding keywords in analytical systems such as Google Analytics. These activities are undertaken by Google to protect the privacy of users on the Internet.

SEO glossary – O

Off-page SEO

Any actions taken outside the positioned website whose purpose is to improve the search engine ranking, such as inbound link acquisition, PR activities, and strengthening the company’s image on social media.

On-page SEO

Actions performed on the website itself, which include both technical optimization of the website code and improving and adding content.

Open Graph Protocol

A protocol thanks to which the indicated fragment of content (text and image) from a given website is displayed after posting a link on social media.
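
A sketch of basic Open Graph tags placed in the head section (the values are placeholders):

<meta property="og:title" content="Example Article Title">
<meta property="og:description" content="A short teaser of the article.">
<meta property="og:image" content="https://www.example.com/cover.jpg">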

Opera

Web browser created and developed by Opera Software ASA.

Optimization

All activities aimed at improving the visibility of the website on the Internet, and thus increasing sales. Website optimization is designed to improve the site’s quality both in the eyes of web crawlers and users, through activities related to on-page and off-page SEO.

Organic Search Results (SERP, Search Engine Results Page)

Lists of results that appear to users searching for given phrases in the search engine. We divide them into paid (PPC, Google Ads) and organic (SEO, natural) results.

Organic Traffic

Users who landed on your website via free search results.

Orphan Page

A page that has no direct links. It can’t be found by the search bot, so it’s useless for SEO.

Outbound Links

Links placed on the website that direct the user and search engine bots to a different website.

Over-optimization

Negatively assessed optimization practices aimed at deceiving the search engine and providing false SEO results, without taking into account the role of the users and their experience on the site.

SEO glossary – P

Page Cloaking

Classified as a black hat SEO technique, this occurs when the search bot is presented with a different version of the content than that seen by the visitor. This is possible by delivering content based on the user’s IP address or the User-Agent HTTP header of the user requesting the site. When a user is identified as a search engine spider, a server-side script delivers an alternate version of the page, one that has content not present on the visible page, or that is present but not searchable.

Page Footer

The lower part of the website, where we can find information such as the company’s address, contact details, and a link to important subpages.

PageRank

The algorithm used by the Google search engine; based on the analysis of the number of links pointing to the page and their quality. The PageRank for individual pages is not published.

Page Speed

How quickly a page loads its content – an important factor that affects the position of the website and its reception by users.

Page Title 

A clickable headline that appears in the search results for a specific website. This is the most visible part of the SERP snippet and has a big impact on your CTR, SEO, and social media sharing.

To set the page title yourself, you need to use title tags in your HTML source code (or use a plugin in your CMS). If your title is too long, the Google algorithm will try to use a different piece of content. Therefore, a page title should stay below 487 pixels for mobile phones and 568 pixels for desktops.

Pageview

A request which occurs each time the user loads a specific page. The number of page views is tracked by the Google Analytics tracking code.

Paid Links

Text ads created in the Google Ads system. Paid links are not part of the SEO environment, which assumes organic growth of the site’s position.

Partial Match Domain (PMD)

A domain name which partially matches the entered keyword, e.g. “ceramic pots in a set” – www.pots-sets.com.

People Also Ask

Most frequently searched phrases suggested by Google, similar to those we entered. “People also ask” most often appears between the search results.

Poison Words

Words that lower the quality of the page in search engines and classify it as poor. These are usually words and phrases that cause distrust on the part of the user and web crawlers.

Posts on Google 

Short posts presenting the company’s life that can be created via the Google My Business panel. You can post content in the form of a photo and a short video. The posts will appear in both the Google Maps listing and the search results.

Positioning

Any deliberate action to improve the visibility of a website on search engine result lists.

Presell Pages

Websites containing articles with links to the page that we want to rank. Presell pages are low-quality pages with meaningless content; therefore, they’re not recommended by SEO experts.

SEO glossary – Q

QDD – Query Deserves Diversity

Google ranking algorithms tasked with matching the user’s intent by displaying the results of a broader match.

QDF – Query Deserves Freshness

A Google ranking algorithm based on the latest information about given events. Most commonly used for political, environmental, economic, etc. events (that should be updated)

SEO glossary – R

RankBrain

An algorithm based on the artificial intelligence model. It evaluates new queries and the overall quality of search results.

RDFa

An HTML5 extension that allows you to mark certain types of content on a website. It’s one of the structured data formats allowing search engines to better interpret the content of the website.

Reciprocal Linking

An agreement between two websites to share backlinks. It’s worth having confidence in the given company, as exchanging links with a company that does not act fairly can harm our website’s backlink profile and make it difficult for search engine crawlers to categorize our website content.

Redirect 301

Redirecting a website’s address to a different website’s address permanently. It’s recommended to redirect the website to one of the two versions – “with www” or “without www” or to redirect the positioning power from non-existent subpages to their existing versions.
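
As an illustration, on an Apache server a single permanent redirect could be declared in the .htaccess file like this (the paths are placeholders):

Redirect 301 /old-page/ https://www.example.com/new-page/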

Redirect 302

Redirecting the address of a website to the address of another website. It’s mainly used when the original page is temporarily unavailable. You can use this redirection for a product page temporarily unavailable in the online store.

Redirect Links

Links leading to a subpage of our website that is then redirected to another address. There are different types of such redirects, which are also covered in this SEO Glossary.

Referrer

A website that leads the user to another landing page. A referrer is sent in the HTTP request header ‘Referer’.

Relative URL

Used for internal linking. This is a short path to the file, HTML page, image, or other content that we link to. This link is used as part of the anchor tag, but it does not contain the protocol and the domain name.
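
For example, an internal link can use only the path instead of the full address (the page names are hypothetical):

absolute: <a href="https://www.example.com/blog/seo-glossary/">SEO Glossary</a>
relative: <a href="/blog/seo-glossary/">SEO Glossary</a>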

Responsive Website

A type of website that adapts to the user’s device. A responsive website, depending on the device, can change the arrangement of elements and the resolution. This lets the website remain user-friendly, both on larger and smaller screens.

Rich Cards

A type of Google search result that presents content in the form of images and text. They are displayed as single results or as a carousel. Rich cards are organic results and may include information such as:

  • website name,
  • title of the article, publication,
  • site average rating and the number of votes,
  • cooking time (in the case of recipes)
  • and more.

Robots Meta Tag

A code snippet that allows you to control how the search engine crawler will index the content of your website. The default value is “index, follow”, which means that the bot will both index your page and follow all the links pointing from it. There are other values (a sample tag follows the list), such as:

  • noindex – stops the bot from indexing the page,
  • nofollow – stops the bot from following a link on the page,
  • nosnippet – the description will not be displayed on the search list (only for Google),
  • noimageindex – stops displaying images from a given page while searching for an image,
  • noarchive – a cached copy of your page will not be stored or displayed in the SERP.
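
A robots meta tag combining two of these values could look like this:

<meta name="robots" content="noindex, nofollow">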

Robots Noindex

One of the possible values ​​assigned to the robots meta tag, which blocks bots from indexing a given subpage. Using “noindex” we give the robot a command not to include our website in the index when analyzing its content.

Robots.txt

A file placed in the main folder of the website’s server that contains directives for the web crawlers that visit your website. In this file, you can both allow and deny Google bots access to certain resources of the site. So you can block access to a given type of files, folders, or website addresses.
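
A short sketch of such a file (the blocked folder and sitemap address are placeholders):

User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml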

SEO glossary – S

Safari

A web browser created and developed by Apple, available on Apple devices running macOS and iOS. It’s the second most popular web browser in the world.

Satellite Sites (also known as PBN – Private Blog Network)

Websites created for positioning purposes, containing content focused on topics related to those covered on the positioned page. They provide links leading to our website that are meant to be perceived by web robots as credible and unobtrusive.

Schema.org

A website where markup diagrams of available structured data on the web are collected. Thanks to them, web crawlers can quickly and easily understand the content of a page and its meaning.

Scraping

Using an automated program to collect data, usually metadata, from multiple websites. The data you collect may include all page titles of your competitors along with their meta descriptions. This technique can be used in black hat SEO if we download the content to use it on our website. As part of the white hat SEO technique, the data is used for informational purposes.

SEA – Search Engine Advertising

A marketing activity that aims to increase the visibility of the website in search engines by means of paid advertisements.

Search Engine Marketing (SEM)

Any activities aimed at improving the position of a website in search results, which include both paid practices (PPC) and organic (SEO).

Search Engine Reputation Management (SERM)

Any activities aimed at increasing the company’s reputation or image on the web, including e.g. moderating opinions, website optimization, etc.

Search Engine Results Page (SERP)

List of links pointing to websites that appeared after the user entered a given phrase.

Search Engine

A website or program that collects and organizes data and information to search for various types of information available on the Internet. The most popular search engines are Google, Baidu, Bing, Yahoo, and Yandex.

Search Engine Spam

Practice related to black hat SEO, which refers to manipulating the search engine index deliberately to obtain a higher rank in SERP and thus reduce the precision of search results.

Search Quality Rater Guidelines

A collection of information and rules created by Google specifically for people who are responsible for evaluating search results and pages manually to detect spam or illegal content. These guidelines are regularly updated and have been available to the public since 2015.

Search Result Feature

A common name for all the different ways in which Google displays information in its results. The search results can be affected by the country we search from, the language, and the device.

Search Result Snippet

A fragment of a search result displayed in the SERP that shows the content of your description meta tag. A traditional organic (natural) search result consists of a title and a description.

Semantic Search

Its purpose is to understand the user’s search intentions so suggestions of websites or products that appear in the search results are more relevant and match the user’s intentions.

Semantic Web

A project the aim of which is to recognize the context and intent of the query that the user submits to the search engine.

SEO Service

A paid service provided by SEO specialists to improve the ranking of our website in search results.

SEO Silo

A method of grouping related content into categories and subcategories on a website to help both people and search engines understand its meaning.

SEO Site Audit 

A full analysis of the website’s visibility in search engines. It should help you locate errors, bugs, and content gaps that you can use in your SEO strategy. There are two types of audits, which you can conduct to gain insight into SEO related issues: technical SEO audit and content audit.

You can find Verseo automatic auditor here:

SEO URL

A user-friendly URL that has been optimized to get the most out of SEO.

SERP Shaker

Black hat SEO technique that uses software to create content and websites solely to rank for low competition long-tail keywords. These sites are then used to generate revenue by adding and creating backlinks, collecting email addresses, etc.

Session

Occurs every time a user visits your page and browses it. A session is counted regardless of whether the person visited your website earlier or not. The default session timeout (e.g. in Google Analytics) is 30 minutes.

Seznam

A Czech internet search engine and information portal.

Short Click

Occurs when the users click on our link found in the search results and leave the page. They return to the search results shortly after viewing the content.

Site

By entering “site: website domain” we obtain information about all indexed subpages within a given website.

Site-wide Link

The link displayed on all subpages, e.g. on the home page, blog, in various tabs.

Sitelinks

Additional links with high positions in organic or paid results. They can appear in a more extensive form, showing more information about the content of the site, or less – depending on whether they match the user’s query.

Site Map

A sitemap is a list of URLs to which we want to direct web crawlers; it’s most often provided in XML, a universal markup language that allows us to describe data regardless of the platform or technology we use. A short example follows the list below.

The site map should contain:

  • the home page,
  • informational pages,
  • category pages,
  • product pages, etc.
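
A minimal XML sitemap sketch with a single placeholder URL:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
</urlset>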

Searchbox Sitelinks

The additional window displayed in the search results where the user can search the content of the website for information about the company. You need to install an internal search engine on the website to display the window.

Search Engine Query

Words or phrases typed by a given person into the search engine. To find a specific answer, you can also use search operators, for example: “OR”, “related:”, “*”.

Skyscraping

A method that involves finding popular content, writing better content on the same topic, and then reaching out to websites with high domain authority that already link to the original pages, offering them your improved content (and link) instead.

Snippet Bait

A short block of content optimized for Google to serve as a featured snippet. It’s placed at the beginning of the article and usually responds to the query in 2-3 sentences so that Google can easily sum up this snippet of content.

Social Media Marketing (SMM)

The process of using social media as a channel that increases traffic, business visibility, and brand awareness by creating engaging content and posting it. Currently, links from social media can also influence your ranking in Google, so using them will pay off.

Spam

Any useless content or even entire websites that do not serve users and clutter the Internet.

Spam Blog

A method of creating blogs automatically by copying content from other websites and posting it on one blog. Because the content is copied from various sources, it’s a black hat SEO technique.

Speed Update

An algorithm update, where Google pays much attention to the website’s loading speed.

Sponsored Article

Form of advertisement placed on a widely read website, most often with a link pointing to our website. We pay for it the amount agreed with the publisher. The main intention is to promote the company’s offer in a relatively subtle way – in the form of an informative text, which, however, is marked as sponsored material. As part of an SEO contract, such articles can also be commissioned by an SEO specialist to increase the visibility of the positioned website.

SQT – Search Quality Team

The team was established by Google to verify the quality of search results using techniques like eliminating spam.

SSL – Secure Socket Layer

The protocol that is part of the HTTPS protocol, which is responsible for data security on the network and is used to encrypt data that is transferred between a web browser on a user’s computer and the website server.

More about SSL: TLS/SSL Certificate – What is it?

STA – Searcher Task Accomplishment

One of the ranking factors taken into account by Google. According to Google, a website that meets the needs and expectations of the user should be ranked higher.

Storytelling

A way of building content that is designed to engage the user just like an interesting story. Both the narrative and the impact on the reader’s imagination and emotions are important.

Street View

Function available on Google Maps or embedded on a given website. Thanks to Street View we can check the exact location we are looking for, also when walking the streets.

Structured Data

Data that helps web crawlers understand the content of a website. The schematics for these markups are publicly available on Schema.org.

Site Structure 

How your website content is organized. A good structure makes it easier for visitors to find what they are looking for, increases conversion and CTR, increases dwell time, and makes it easier to index your site. Since good structure equals good UX, the structure of your website has an impact on your search engine rankings. The optimal structure of the website should:

  • have a clear hierarchy: home page, 3-5 categories, and subcategories,
  • have an effective internal linking strategy,
  • use navigation, etc.

Supplemental Index

The secondary index of the Google search engine, designed for websites that web crawlers have marked as less valuable or a duplicate. You can check them by repeating your searches and taking into account the missed results.

SXO – Search Experience Optimization

A combination of SEO activities with UX, a process designed to improve the user experience on our website. SXO takes into account both the search engine and user-friendliness of the website.

Social Signal

An SEO term which describes social engaging activities such as shares, votes, pins, views, etc. It’s not entirely clear if it affects ranking, but it certainly does increase the likelihood of a site being cited (by increasing backlinks and improving brand authority).

SEO glossary – T

Target Group

A group of individuals that a company wants to target with its products and services. It’s consistent with demographic, psychosocial, and behavioral conditions (both online and offline).

Title Tag

A tag that describes the title of the page. It’s one of the more important tags because it allows web crawlers to determine what the given page content is about, so it should contain keywords that we want our website to rank for.
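
The tag sits in the head section of the page, for instance (the wording is just an example):

<title>SEO Glossary – Key Terms Explained | Example Brand</title>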

Technical SEO

Part of SEO that deals with making it as easy as possible for the search engine crawler to index your site and focuses on improving the rendering phase. Technical positioning is the basis of all other SEO activities and is essential if you want your website to perform optimally.

Technical SEO activities include:

  • improving website performance,
  • optimization of content and images,
  • improving site structure etc.

TF-IDF – Term Frequency – Inverse Document Frequency

A technique that can be used when creating content on a page, where the weight of a phrase is calculated based on how often it occurs in a given text relative to how often it occurs across other documents.
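
In its classic textbook form (a sketch, not Google’s exact implementation), the weight of a term t in document d is:

tf-idf(t, d) = tf(t, d) × log(N / df(t))

where tf(t, d) is the number of times t occurs in d, N is the number of documents in the collection, and df(t) is the number of documents containing t.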

Thin Content

Content whose amount or quality is insufficient for web crawlers and which is thus classified as of little or no value to the user.

Tiny Text

Black hat SEO practice, in which some content is hidden from the user by making it small, so small that it’s almost invisible to the user, and yet it has been indexed by network bots.

TLS – Transport Layer Security

Part of the HTTPS protocol responsible for encrypting data and monitoring its integrity during its transfer between the browser and the website server. TLS protects data against interception, e.g. by unauthorized persons or bots.

More about SSL: TLS/SSL Certificate – What is it?

Traffic

One of the most famous SEO terms that describes the total number of visitors to a given website. Types of traffic:

  • direct
  • organic
  • paid
  • referral

Traffic Potential

A number of potential visits to your website that you can get while occupying the first place in Google search results.

TrustRank

Alternative created by Yahoo for Google PageRank, based on which the level of trust to a given website is determined.

Twin Website

A website that is very similar to another website – in terms of its content. Such a page has a negative impact on the positioning of the first – main page.

SEO glossary – U

Unique Visit

The first visit of a given user (with a given IP) to our website. Their number, e.g. per month, can be measured using e.g. Google Analytics.

Universal Search

Google results placed among traditional organic results, but which include additional media such as images, videos, maps, locations, etc. This is Google’s attempt to create dynamic search results where the user is dealing with various forms of search results.

Unnatural Link

Links created to manipulate a page’s ranking; when detected, they can result in a penalty and a drop in position.

URL – Uniform Resource Locator

Address of a given resource, which includes: protocol, server address (domain name), and additional path information.

URL Parameter 

Values set dynamically in the page URL during the query. In page addresses, anything after “?” is a query parameter string.
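
For example, in the hypothetical address below, everything after the question mark consists of two parameters, category and sort:

https://www.example.com/products?category=shoes&sort=price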

User-Agent

Identification header sent with the HTTP protocol by the program (browser or bot) that connects to the web server.

User Engagement 

An indicator that measures the extent to which users engage with the content of the website.

User Experience (UX) 

A search engine ranking factor that describes how easy it is to use a given product or service (or website).

Factors that influence UX include:

  • simplicity and usability of the user interface,
  • site structure,
  • load time for website on various devices,
  • responsiveness and friendliness for mobile devices.

UX takes into account users’ feelings in particular. If it’s important to Google, it’s also important for the SEO of our website.

User-friendly

Term for devices or services that are easy to understand and use for any user.

User-generated Content (UGC)

Any content (blog entries, images, videos, posts in social media) published by internet users about our brand. It can have a positive impact on SEO because we can gain links and publicity on the web.

User Interface (UI)

Everything displayed on your site that the user can interact with. The user interface must be:

  • simple (avoid unnecessary elements),
  • transparent (use colors, sizes, and layout to create a hierarchy),
  • coherent (various elements should come together)
  • intuitive (the website should be easy to understand and the content should communicate what the user should do)

The purpose of the user interface is to create a positive impression for visitors. As these are factors that influence time spent on the website and engagement, they are critical to SEO.

User Intent

When creating content for the website and developing your services, it’s worth taking into account the user’s intention – the reason why he/she enters the search terms into the search engine (depending on the stage of the shopping path).

SEO glossary – V

Vanity URL

More friendly form of our company’s website address, which occurs, for example, when we share it in posts on social media. It’s usually a shortened version of the address that leads to a subpage with a longer URL link.

Vertical Search Engine

A search engine that focuses on a specific segment of online content – usually a topic or type of media.

Viral Content

Content that becomes very popular in a short time (e.g. post, photo, video). Most often it concerns recent events or situations that are easily picked up by users on the Internet. They are good for SEO, for acquiring links to our website, and for publicity.

Video optimization – vSEO, Video Search Engine Optimization, Video SEO

Actions taken on video websites (YouTube is the most popular of them) aimed at improving the visibility of a given video in search results. You can optimize both the entire channel and a single video, for example by:

  • identifying and using keywords in descriptions,
  • creating a description that will be helpful both for viewers and crawlers,
  • adding tags to each of the uploaded videos to define their subject,
  • creating thumbnails that will stand out from other videos in search results.

Vlog

Presenting content in the form of a video. The most popular platform where you can watch vlogs is YouTube.

Voice Search

A function that allows you to search for a given query in a search engine through a voice command.

SEO glossary – W

Web Directory

A type of website where we can find links to other websites and descriptions of what they contain so that you can quickly find the website you are interested in. Web directories are organized by topics, e.g. they collect all links related to recent events or specific types of products.

Webmaster

A person who deals with technical aspects related to the creation and optimization of a website. Webmaster is responsible for updating it or introducing changes that are necessary for website positioning.

Webmaster Guidelines

Specially created guidelines by Google dedicated to webmasters, with the help of which they can adapt websites to the requirements of web crawlers and users.

Website

Set of www pages in the same domain.

Website Audit

Assessment of the website and its links made by an SEO specialist. It’s the base for creating an individual positioning strategy (optimization, linking, website reconstruction, introducing new content, etc.)

Website Audit – SEO Verseo 

A proprietary tool made by Verseo, designed to help companies quickly check what works and what should be improved on the website. The free Verseo SEO audit will allow businesses that have never cooperated with an SEO specialist to quickly check what errors appear on the website and introduce corrections that will significantly improve its operation.

Website Quality

A set of features that search engine bots check when ranking your site. In determining the quality of a website, the following key elements are taken into account:

  • content relevance,
  • content length,
  • user involvement,
  • spelling,
  • social signals,
  • backlink profile,
  • legibility,
  • variety of media (videos and images).

White Hat SEO

All activities that rank a website in search results and are consistent with Google guidelines as well as fair to competition and network users.

White Paper

Official content, for example on specific areas of the company’s activity, which are designed to provide up-to-date information. They are primarily used for PR activities aimed at strengthening the company’s image on the web and building brand awareness.

WordPress

One of the most famous and popular content management systems, created in PHP. It offers many website templates, which make it easy to build a site from scratch.

SEO glossary – Y

Yahoo

Formerly a leader in email, online messaging, and search; now used as an internet search engine by a small percentage of users.

Yandex

The most popular internet search engine in Russia with over 50% share in this market. In the world, however, it’s relatively unpopular with less than 1% share.

Yoast SEO

One of the most popular SEO plugins for WordPress. It can for example:

  • set the title and description of a fragment of search results,
  • check the readability of the content on our website,
  • set a keyword for a given content,
  • manage links and meta tags, etc.

SEO glossary – Z

Zero-Click

Search query with no clicks. Occurs when the user does not click on any result for some reason. One of the common reasons why a query does not get any clicks is because it’s a top-level informational search in which Google answers the question with some form of rich results. For example, the query “how old is Brad Pitt?” gets the answer already in the search results.

Zero-match Anchor Text

Anchor link whose phrase does not include any content informing what we will find exactly after clicking on the link. Most often these are links containing CTAs like “click here” or “check here”, etc., which direct to a specific subpage or landing page.

Zero Moment of Truth (ZMOT)

A marketing concept created by Google, according to which consumers, when making a purchase decision, also pay attention to opinions, comments, and information about the product or service available on the Internet. Therefore, it’s so important to remember about this in the context of your company, as this may affect the purchasing decision.

Summary

That’s all, Folks!

Over 300 terms in the field of SEO! A huge dose of knowledge! We hope you found the term you were looking for and that the SEO Glossary did its job. If there is a term that is not included here, feel free to let us know, we will be happy to add it so that other users have a full knowledge base.

Website Security – How to Keep Your Website Safe

   |   By  |  0 Comments

website security

The term “online security” is everywhere: public announcements, notifications from our banks, and articles about new data leaks. And all this information goes – as our parents used to say – in one ear, and out the other, with us usually doing nothing about it. It’s time to change that. Remember: better safe than sorry! How can we ensure the security of our websites?

Cyber Attack Types

Can my website be hacked? Is my website worth hacking? The truth is that no place on the web isn’t worth breaking into. Each domain and each page can be used to build links to ads for penis enlargement, redirect to suspicious domains, or silently use our server for mailing porn sites to innocent Internet users. The number of ways to use a hijacked website is limited only by the hackers’ creativity.

Once a website is hacked, it can have various consequences like malfunctions or dropping out of the Google index as a result of removed optimization. Hacking also increases the risk of your website being removed from the SERPs as a penalty for illegal content. In addition, each time the website gets hacked, it loses credibility due to data safety concerns. It’s user data that attracts hackers. In some cases, hack attacks will cost you money in the form of lost potential income during your website’s downtime. The costs of server cleanup and site restoration can also make a big difference when it comes to your budget. There are also legal consequences.

Who Is Most at Risk?

Every website and every person using the Internet may fall victim to many types of cyberattacks. Even large and well-protected companies suffer security breaches.

  • Adobe lost data in 2013 – usernames and passwords, full names, credit and debit card details – from 153 million accounts,
  • LinkedIn leaked email addresses and passwords in 2012 and 2016,
  • Garmin lost access to a large part of its systems in 2020 and was forced to disable access to all services. The company paid a ransom to gain access to the data decryption key,
  • Source code for the games Cyberpunk 2077 and The Witcher 3 was stolen by hackers from CD Projekt RED in 2021.

The scale and damage of these break-ins is greater than what small website owners usually experience, though this doesn’t mean that you should ignore safety precautions and not take appropriate steps. Remember: better safe than sorry!

The popular CMSs WordPress and Joomla are usually mentioned as systems that fall prey to hack attacks, the frequency of which is influenced by factors like the immense popularity of these platforms and a rather frivolous approach to security updates and notifications. In the case of WordPress, plugins are an additional source of attacks, as some are third-party software. The vulnerability of Joomla can be observed during system configuration, which tends to generate errors. WordPress acts similarly when installing external extensions.

Ensuring Website Security

Here are six basic solutions for raising website security. 

  1. Password123

Professional tools and the most complex security programs can fail due to lack of common sense. Passwords are the first line of defense when it comes to online accounts. Common password failings include:

  • Simplicity – passwords such as qwerty, 12345, abc123 or passwords associated with the website name. These passwords are easy to break, as they are quite intuitive and often found on lists of stolen data.
  • Universality – one password for everything? Simple solutions like this work well… for a short while. Data leakage from one website may result in the loss of accounts in other websites as well.
  • Keeping all passwords in one place – a notebook or an ordinary text file is also a bad idea. While a traditional notebook may fall prey to a less IT-savvy thief, a text file may be targeted by anyone who knows how to take advantage of users’ trust in public networks (e.g. wifi in a restaurant) – an easy path to taking over all our Internet accounts in a short time.

So how do you keep your passwords secure? One of the solutions that works well for both private and business accounts is an application that stores passwords in a safe way, such as LastPass, 1Password, or their free equivalents – KeePass and BitWarden. These solutions are similar to the criticized notebook – they store passwords in one place – but they are much more secure. These programs generate extremely complex, encrypted passwords; the users only need one master password that allows them to access the application. You can use programs of this type not only on computers but also on mobile devices.

Multi-tier verification is another effective way to limit the possibility of hijacking your accounts. Implementing this solution requires user verification on several platforms, like a phone or tablet, allowing for additional control and preventing unauthorized login attempts.

Passwords should also be changed from time to time, especially in the case of rotation among company employees. After changing the people responsible for various internet activities, it’s a good idea to refresh the passwords and make sure that only relevant employees have access to the password-protected content.

  2. Updates

Updating systems is another element that reduces the risk of unwanted activity on a website. This applies to all elements – the CMS, software on the server, and plugins and extensions. In addition to developing the application and adding new functionalities, an update should also eliminate bugs and potential threats existing in the earlier code. A lack of updates for the software running within the website makes it much easier to take over the website.

You can find out how often security patches are implemented by analyzing the information provided with subsequent software updates. Most updates go hand in hand with a blog post or note detailing each subsequent patch.

Information on WordPress can be found at the official WordPress website.

Joomla announces updates here. 

CMS update info usually appears on the main page of the administration panel. Note – updating older CMS versions (e.g. those that have not been updated for several months) may result in problems with the basic functionalities of the website. Backup your website files before starting the update. Use help from specialists – e.g. a software house that was responsible for the original website launch. Remember to update the system on which the website is hosted when designing the website and include updating assistance in contracts.

  3. Plugins and Extensions

Each of these offers new functions for our CMS, unfortunately – not always only the ones we are interested in. So how do you recognize safe plugins? What to look for?

Before installing an add-on to our CMS, it is worth taking a look at its popularity and reviews, checking the credibility of software developers, and checking how often updates are published and what they are about.

Leaving plugins without updates is a mistake. Bugs can lead to taking over your website, and they need to be dealt with.

  4. SSL

SSL (Secure Sockets Layer) is an encrypted Internet security protocol. Initially developed by Netscape in ’95, SSL is the predecessor of TLS encryption, which is widespread today. Transport Layer Security (TLS) is a cryptographic protocol created to ensure communication security in computer networks. SSL or TLS guarantees the confidentiality of data transmission and server authentication. It is based on asymmetric encryption with a public key.

  5. Remember To Backup

Backups may be just another item on the list of monthly website maintenance costs, but in times of crisis, they become priceless. Backups stored in a safe place reduce the risk of data loss. Securing databases and placing them in a different space than the original files will enable business continuity, even in the event of failure or losses caused by burglary.

Backups will also be stored by the hosting on which the website is located, which is very useful in the event of technical problems with the website – restoring the state from two or three days ago is usually a matter of one phone call. Unfortunately, leaving the security of your websites in the hands of third party companies doesn’t always work.

  6. Educate Yourself and Your Coworkers

The security of a website depends on many factors. The negative impact of some of them can be reduced by education. Sometimes it’s enough to brush up on the basics of online security – the rules of creating passwords, secure internet connections, and “logging out after work” – or to be suspicious of unusual-looking emails.

Websites are vulnerable, but securing them and all the resources necessary for our company doesn’t require specialized knowledge or technical skills – in many cases, all it takes is implementing best practices.


Building Backlinks

   |   By  |  0 Comments

building backlinks

How did Google positioning start? The algorithm determining the search results was simple – and the number of links to the page had an impact on position. Today, web crawlers aren’t so easily manipulated and there are many other factors, although links are still valuable. Where should we build backlinks and how can we make the operation truly effective?

Quantity Over Quality: Linking in the First Decade of the 21st Century

In the past, SEO was simple and keywords were included in every line of code. Nobody cared about the user’s needs: not only because of the lack of technical capabilities (which evolved with the growing popularity of HTML5 and CSS3), but also because there were fewer users. 

How did we backlink several years ago? The most accurate answer: with no limits. Initially, it was all about quantity, so there were tons of spam websites called presell pages, commonly used as the location for building backlinks. Some were not moderated – it was enough to access the site, set up an account, sometimes pay, and prepare a text to be used in a spinner. Programs of this type enable the creation of synonym-based texts that could be used on a larger scale. The texts were similar, but the use of synonyms gave the illusion of uniqueness. However, reading them was not recommended – they did not contain valuable information and were merely sites to store links to positioned pages.

Over time, the significance of links has changed – high positions were achieved by links from educational (edu) and government (gov) sites, and the role of blogs with junk articles began to decline.

Penguin – A Google Update That Turned the SEO World Upside Down

In 2012, an update was added to the search engine algorithm that ended the golden age of careless linking. Penguin reduced the importance of links from spam sites and diminished the possibility of black hat link-building practices.

Penguin analyzed backlinks and assessed whether they were natural. Links from cluttered presell pages, low-quality internet forums, and other sources of poor reputation received very low scores – resulting in their drop in Google ranking. The update also detected duplicate link patterns suggesting the activity of scripts adding links on a massive scale.

Initially, the Penguin update was only active from time to time – its impact was mostly manifested by a rapid change in a given site’s visibility and a decrease in traffic. The Penguin filter forced SEO specialists to rapidly “clean” the profile of incoming links. Such a removal of low-quality links allowed for a steady ranking increase, but a return to high ranking positions was not always possible. In some cases, a banned page only returned to the Google search engine after several months – when a new version of Penguin was added to the algorithm.

The update premiered at the end of March 2012, and the effects of its subsequent versions could also be seen in October 2012, May and October 2013, and October 2014. In September 2016, Penguin was merged into the basic elements of the search engine algorithm and – from that point – it has worked in real time. Its constant activity enables faster reactions to the negative effects of ineffective linking – without having to wait for the next update. Additionally, since Penguin began functioning in real-time mode, it has been more forgiving to spam linkers: disputable links simply tend to be ignored. However, this doesn’t mean that you can link everywhere now – sites that employ spammy tricks can still be punished by elimination from search results.

Linking methods have changed from Google’s early days, but linking continues to have a significant impact. It is, therefore, advisable to take a closer look at the correct link structure.

A link is one of the basic elements of HTML – a markup language commonly used to describe web pages. It is an element leading from one document to another which is most frequently displayed after clicking on an indicated tag. The global Internet structure is based on links.

What does a link look like?

<a> – tag

href – an attribute containing the link’s landing page address

anchor – (“Verseo” is an example here) link text, visible on the website.
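
Putting these together, a complete link could look like this (the address is a placeholder; “Verseo” is the anchor text):

<a href="https://www.example.com/" title="Verseo">Verseo</a>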

A link can point to a place within the same document (href="#"), to a page within the same domain (href="/about-us/"), or to any document online (href="https://other-page.de").

In the <a> tag, there may also be a title displayed when you hover the cursor over the link, as well as a target determining whether the document will open – e.g. in a new browser tab. In the hyperlink tag, you can also add guidelines for crawlers, e.g. prohibiting them from going to a specified URL.

In positioning, the link anchor element is crucial. It’s here that you can place a keyword forming the “title” of the link pointing to our website. When planning a link-building strategy, selecting an appropriate anchor will be one of key elements to make our activities natural. The most essential thing here is diversification in the employed methods of link labeling:

  • URL address – the anchor is the website address pasted into the content of an article or post. It looks the most natural; after all, hardly anyone bothers to describe a link dropped into a forum post or comment. Hyperlinks with a URL anchor are also often created automatically, for example by pasting the address into text in Google Docs;
  • brand – the name of the business or website to which we link. It is a safe form of link naming that also allows you to promote your brand;
  • keyword – a riskier form of linking: careless overuse may be penalized by the search engine, because an anchor consisting only of a keyword doesn’t look natural. Keyword linking helps in website positioning, but it should be used as one of many solutions;
  • image – a link to the page can also be placed within a photo or a file with the business’s logo. The image should then be properly described, in line with the requirements of the <img> tag (e.g. with an alt attribute).
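
A minimal sketch of these four anchor variants – all the addresses, names, and keywords below are placeholders:

<!-- URL anchor -->
<a href="https://example.com/">https://example.com/</a>
<!-- brand anchor -->
<a href="https://example.com/">Example Company</a>
<!-- keyword anchor -->
<a href="https://example.com/running-shoes/">running shoes for beginners</a>
<!-- image anchor – the alt attribute describes the linked image -->
<a href="https://example.com/"><img src="example-logo.png" alt="Example Company logo"></a>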

Links can be placed in every section of the website – in the menu (the top or side one), in the text, in comments, as well as in the footer. They can be located on one subpage of the website or repeated within a category or the entire domain – in such a case, they are called site-wide.

With the beginning of the Penguin era, collecting links to a site became more difficult. Finding a good source of links requires careful selection and the rejection of low-quality pages – e.g. presell pages dominated by chaotic texts with a high concentration of keywords.

So how should we build backlinks? It’s advisable to look for high-quality link sources – ones with good Google ratings. The search engine doesn’t officially publish such site scores, but there are external tools providing this kind of information. One of them is Ahrefs, whose Domain Rating metric evaluates the number and quality of links pointing to a domain.

When looking for the perfect place to leave a link, we should also consider the number of users visiting the site. High organic traffic means a high visibility level of the domain – which says a lot about Google’s assessment. Such information can be obtained with the help of services like SEO Surfer.

Another important element is the website’s link profile – the overall makeup of its links. The number of outbound links is also crucial: too many of them weaken the “link juice” directed to our website. What is this “link juice” exactly? It is simply the way power (authority) is transferred from one domain to another. An excessive number of links to external websites “waters down the juice” and reduces the value of each backlink – a page that links out five times passes far more authority through each link than the same page linking out a hundred times.

Apart from these techniques, users seeking links should focus on websites thematically related to the content of their own page. It will also be helpful to build links from local websites promoting activities in our country, province, commune, or city. Links to a domain in a given language should be placed on websites written in that same language.

Building links is a time-consuming process – and it lasts for as long as good online visibility matters to our business. Activities in this field should be planned wisely and should take into account the other marketing actions implemented by our brand.

Let’s discuss the most popular link sources:

Natural

Natural links are created without the participation of interested parties – Internet users spontaneously share interesting links in comments, on forums, or on their blogs. Links of this type are a sign that the content on our subpages is of good quality, and users sharing it can bring a lot of traffic to our website.

Building links this way requires a lot of involvement in preparing website content. Users willingly share tips, rankings, summaries, and attractively presented bite-sized knowledge, e.g. in the form of infographics. In some industries, we can also prepare witty materials that have a chance to go viral and quickly reach many potential recipients. Once content gains such online recognition, links are likely to appear that complement the profile of the domain we want to position.

Forums

Internet forums have lost popularity in recent years in favor of groups on Facebook, but those that have survived online remain a source of high-quality links. Adding links there, however, must be consistent with the thematic range of such forums and their rules – which limits the possibility of spamming. Links placed in a response to a question asked by a user have the best chance to be accepted by strict admins and can support the site for a long time. It’s advisable to link where we already have an account and previously participated in the discussion.

Comments

Certain websites and blogs allow you to comment on articles and even supplement comments with active links. Links that we post there should be thematically related to the given entry. Adding a link in a comment doesn’t automatically mean it will be approved by the webmaster. In addition, such links will often carry the rel="nofollow" attribute – which tells crawlers not to follow them to our site, although users can still click on them.

Catalogs

It’s one of the older forms of link-building – one which enables a large increase in the number of links in a relatively short time. However, modern catalogs are of higher quality and demand more care from the link provider. A good catalog will require supplementing the entry with basic information about the company’s activity, a unique description – and sometimes also images. Catalogs enable additional promotion, e.g. through a higher position for our entry in the catalog’s internal search engine for services or a link to the entry on the catalog’s main page.

Guest Articles

A fairly popular method of building links and attracting new readers is publishing articles outside your website – e.g. on a blog. A guest article allows the blog owner to get new, valuable content. Such publications shouldn’t have an advertising character, but rather focus on providing information and sharing knowledge. The possibility of adding an active link to such content depends on the site’s owner.

Sponsored articles aim to promote a given site, although such texts don’t have to be of a typically advertising nature. Articles usually contain 1 to 2 links – most often with an anchor of our choice. This service is paid, and the cost depends on the site’s reputation – the more traffic and the bigger the website, the higher the price.

Cooperation in the field of sponsored articles can be established directly – through the site’s administrator or marketing department. Websites specializing in mediating between publishers and clients looking for space for marketing activities are also very popular.

Buffer Sites

Buffer sites constitute good sources of links. They are web pages – mostly thematic ones – created to host links pointing to the promoted pages. A buffer site requires the purchase of a domain, space on a server, and a CMS or another solution allowing you to conveniently add articles. Additionally, content creation should be included in the costs – especially if we don’t plan to do it on our own.

For building buffer sites, it’s a good idea to pick up an expired domain. Domains with a good history are especially valuable: the content previously published on them coincides thematically with our website, the domain hasn’t been penalized by Google, and its incoming links come from trusted sources.

High-quality buffer sites can also be the subject of other marketing efforts – well-written, popular articles can become an additional source of traffic and an element in building the expert position.

Partner Sites

The cooperation of two companies can be expressed in many ways – including mentioning partners or contractors on the website. The partner’s logo is usually placed on the website and wrapped in an <a> tag pointing to the partner’s page. When analyzing our marketing activities, it’s advisable to review the places where our business name appears and ask to have a link to our website added there.

It’s also a good idea to approach other places in this manner – e.g. lists of companies offering given services in the region, prepared by an independent entity. It’s advisable to find ones that have been operating for a long time – and submit your company’s information.

Offline Activity

One of the effective ways of building high-quality links is to increase your influence in the offline field as well. Actions of this type are very effective, especially in local positioning: social activity, participation in meetings, or financial support for local events usually results in our brand’s name being mentioned on those events’ webpages. These “few words about us” can always be supplemented with a link to our website (usually a brand-name anchor), which will perfectly complement link-building activities. It’s also worth remembering about biographical notes, e.g. on the websites of universities or conferences in which we participate.

Social Media

Links from social media don’t make our site climb faster in SERPs, but they help to increase our website’s traffic. Social media profiles are an excellent channel of communication with clients. In addition, links to attractive content usually spread faster via social media.

Other Linking Ideas

Building valuable links isn’t an easy task, and it frequently requires creativity. Some approaches will be effective in one industry and not in others. Building a link on Wikipedia is a good example. When promoting websites concerning famous people – athletes, politicians, etc. – an entry with a short biography and a description of their activities is unlikely to be blocked by the online encyclopedia’s editors. An attempt to build backlinks by a company specializing in screws, however, will require a lot of effort – e.g. creating a source that complements Wikipedia content – and will probably be rejected anyway.

It’s also advisable to study what the competitors are doing – thanks to tools such as Ahrefs, Majestic, or Seo Surfer, we can see the links pointing to their websites. These tools, however, won’t show every link – for example, some sources may block the bots collecting this type of data. Replicating competitors’ link sources allows you to build an equally good link to your own site and, following the principle of “watering down the juice”, slightly weaken the link pointing to your competitor.

When looking for places to build links to your website, it’s advisable to pay attention to the quality of links and emphasize their “natural” character. Rapid link building usually doesn’t improve the website’s ranking position. As with on-site activities, off-site ones also pay off thanks to systematic work, monitoring, and analysis of the results.