VEVIDA cannot help you register your website(s) with search engines. However, we have prepared some tips and recommendations for you on this page.
There are several factors to consider in order to improve your websites’ search engine rankings.
1) Description & keywords
You can use meta tags to give browsers or search engines information on your document, such as a description of the document’s contents. Example:
<meta name="description" content="VEVIDA is an international webhosting provider for professional webhosting">
<meta name="keywords" content="VEVIDA, webhosting, asp, asp.net, php, hosting, frontpage, e-mail, web, www">
These meta tags are placed between the <head></head> tags.
However, because so many webmasters have misused these meta tags for spamming purposes, such as repeating keywords to boost their ranking, some search engines no longer use the keywords tag at all. So do not choose too many keywords, and keep them relevant.
Once you have some good keywords, it is important to keep them up-to-date. In other words, you should modify them as needed.
2) Robots.txt
Search engines look in the website root for a special file called robots.txt (http://www.yourdomain.com/robots.txt). This file contains instructions telling robots (spiders) which files may be indexed. This system is called the Robots Exclusion Standard.
The robots.txt file uses a standard format. It is made up of records, with each record in turn consisting of two fields: A User-Agent (UA) line and one or more Disallow: lines. The format is:
<Field> ":" <value>
The file must be created and saved in text format.
The User-agent line designates the robot, for instance:
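User-agent: Googlebot

(The robot name Googlebot is only an illustration; use the name of the spider you want to address.)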
The use of the wildcard sign (*) is permitted to denote all robots:
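User-agent: *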
The second part of a record consists of Disallow: directives. These lines designate files and/or directories. For instance, the following line means that spiders cannot index the file email.html:
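Disallow: /email.html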
Directories can also be entered:
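Disallow: /cgi-bin/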
This blocks the cgi-bin directory for spiders.
The Disallow: directive matches by prefix rather than by exact name: according to the standard, Disallow: /email means that neither the file /email.html nor the files in the folder /email/ (such as /email/index.html) will be indexed. If the Disallow: value is left empty, then EVERYTHING will be indexed.
At least one Disallow: line must be present for every User-agent line.
An empty robots.txt file will be ignored.
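Putting this together, a complete robots.txt file could look like this (the paths and the robot name are illustrative):

```
# Keep all robots out of the cgi-bin directory and away from email.html
User-agent: *
Disallow: /cgi-bin/
Disallow: /email.html

# Keep one specific robot out of the entire website
User-agent: ExampleBot
Disallow: /
```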
Robots directives can also be placed in meta tags, for example:
<meta name="robots" content="ALL" />
<meta name="robots" content="noindex,nofollow" />
3) Title
A website has a title, such as “VEVIDA International Webhosting Provider – ASP.NET Webhosting!”. Select a title that suits your website. Every page in your website can also have its own title. If you have a website for your farm, the title tag of the first page could be:
<title>The Janssen family farm</title>
If the second page is about your pets which also roam about the farm, you could call it:
<title>Pets of the Janssen family</title>
4) Indexable content
Many search engines also index the text on your website. Images and Flash components are content too, and often very attractive content, but unfortunately they are not indexable. For this reason, we use the term ‘indexable content’ here.
A good text that is relevant to your topics (harmonize it with your description, keywords and title) will rank higher than gibberish or irrelevant text. If you want to attract visitors, you should not write about assembly programming on your farm’s website, unless of course it is a hobby. Another advantage of good, clear, attractive text is that your visitors will want to come back to your website. Part of making (and keeping) your website attractive is including ALT attributes for images, ACRONYM tags for foreign and/or difficult words, TITLE attributes for hyperlinks, etc.
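For the (hypothetical) farm website used as an example above, these attributes could look like this; the file names are made up:

```html
<img src="barn.jpg" alt="The old barn at the Janssen family farm">
<p>We also run a small <acronym title="Bed and Breakfast">B&amp;B</acronym>.</p>
<a href="pets.html" title="Pets of the Janssen family">Our pets</a>
```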
A website that meets the W3C (World Wide Web Consortium) standards for (X)HTML/CSS will be displayed properly in every browser. Visitors will not be required to use a specific browser (e.g. only Internet Explorer, or Mozilla). This will help you reach a wider audience. You can find a great deal of information and tutorials on the W3 Schools website.
5) Links
Many search engines use a kind of page ranking: the more websites link to your website, the higher your website will appear in the results. Some places, known as ‘directories’, allow you to register your site for free or in exchange for payment, as do large websites such as Startpagina.nl (or one of its subsidiaries).
Be careful: emails sent to webmasters requesting links to your website are often considered spam.
6) Search engines
Once you have taken all of the above into account, it is time to register your website with the search engines themselves.
Please Note: Once you register your website, it may take up to several weeks for it to appear. Simply registering is no guarantee that the website will actually appear.
If you use an online service that registers your website with 30 to 40 search engines at once, you will most likely receive a lot of spam, because you had to provide your email address.
You can fine-tune your tags, text and HTML on the basis of your website statistics (which you can access by default by typing /stats/ after your domain name). This is worth doing regularly, because search engines change their indexing methods frequently.
Website and database optimization (forum, CMS, etc.)
[This information discusses the use of PHP in combination with a MySQL database, but is equally applicable to all other scripting language/database combinations]
Today, many websites are made using a CMS (Content Management System), with an underlying database. Websites can get slower over the course of time. Some tips are given below to prevent this and improve speed.
1) MySQL database
- a) Just like data on your computer’s hard drive, data in a MySQL database gets fragmented (spread out) over time. When data is deleted (news articles, replies, forum posts, etc.), gaps are left between the data in the database. Just as you have to “defragment” your hard drive from time to time, you also have to “optimize” your MySQL tables.
This takes all of the data and places it neatly together, freeing up space in the database and enabling faster information retrieval.
You can optimize a MySQL database via the phpMyAdmin environment. Select all tables and select the “OPTIMIZE” action.
Aside from data getting fragmented, a table can also get corrupted or damaged. A regular “ANALYZE” or “REPAIR” action via phpMyAdmin is just as important as running an “OPTIMIZE”.
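In phpMyAdmin you can tick the tables and pick the action from the drop-down list; the same maintenance can also be run as plain SQL. The table names below are only examples:

```sql
-- Defragment the table data
OPTIMIZE TABLE forum_posts, forum_topics;

-- Update index statistics, and repair a damaged table
ANALYZE TABLE forum_posts;
REPAIR TABLE forum_posts;
```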
- b) Keep the database as small as possible. Regularly delete irrelevant data, such as spam, outdated files, etc. The smaller the database, the better.
2) CMS
A CMS can be regarded as a framework that you can use to develop extensive dynamic websites, integrate components, etc.
- Select the smallest possible template for your website. Edit the .php, .css and .js files and delete items such as comments and excessive whitespace. Be sure to make a backup of the files first, in case too much is deleted, and test the changes thoroughly. Where possible, try to merge .css and .js files to minimize the number of HTTP requests.
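As a sketch of merging stylesheets (assuming a Unix-like shell; the file names are made up, and the sample files are created here only to keep the example self-contained):

```shell
# Create two small sample stylesheets so the sketch can be run as-is
mkdir -p css backup
printf 'body{margin:0}\n' > css/reset.css
printf 'main{width:960px}\n' > css/layout.css

# Back up the originals first, in case too much is deleted later
cp css/reset.css css/layout.css backup/

# Merge them into one file, so the browser needs one HTTP request instead of two
cat css/reset.css css/layout.css > css/combined.css
```

Remember to update the page’s stylesheet links so they point at the merged file.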
- In the administrator area (the “back-end”), deactivate add-ons, components and modules that are not in use. Next, delete their corresponding directories via FTP, and MySQL tables via phpMyAdmin.
- Activate caching mechanisms from the back-end or in the code (PHP Output Buffering Control, Smarty template engine, PHP PEAR Cache-Lite), or install special cache plugins.
- For better performance, remember: use common sense.
Is it really necessary to have cool effects with “JCE Utilities” if the “JCE editor” is installed? All of those slick Ajax pop-ups require additional .css and .js files and additional HTTP requests; they increase the total page size and therefore the page load time.
- Use pre-compressed images, with the .jpg or .png extension. These file formats have already been compressed (reduced), unlike .bmp files (bitmaps). Make thumbnails (reduced versions of images) yourself instead of having this done by a script or server. The Microsoft program Image Resizer, from Windows XP PowerToys, is a good tool for this.
- Determine what content should be displayed on the main page. Divide the rest of the content up into different categories and sections.
- Deactivate a search function from a forum, CMS or website completely, or, for instance, only for unregistered visitors.
3) On the web server
The web server comes with various techniques, activated by default, for caching and/or sending compressed (deflate or gzip) content (data) to the browser (visitor).
Some examples here would be HTTP compression (for static pages, featuring extensions such as .htm, .html, .txt, .css, .js, .xml and .rdf), FastCGI ISAPI for PHP (where PHP remains loaded in the server’s memory), Zend Optimizer and IonCube Loader. Or even a mysqli PHP extension, for faster and better MySQL access from PHP scripts.
4) And some more…
By default, a browser (Internet Explorer, Firefox, Safari, Opera) only makes two simultaneous HTTP requests per host name. A website with six .css files will therefore need three rounds of two HTTP requests, which costs time. For Internet Explorer, this value can be adjusted in the Windows Registry; for the procedure, please see http://support.microsoft.com/?kbid=282402. In Firefox, it can be adjusted via about:config (Preference Name: network.http.max-persistent-connections-per-server).
In addition to this, it is also possible to divide images and .css and .js files over different URLs, or DNS host names (host headers). For instance, images1.yourdomainname.nl for standard images, images2.yourdomainname.nl for uploaded images and css.yourdomainname.nl for .css files. This way, the different website components are retrieved from different URLs with two HTTP requests per DNS host name.
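In the HTML this could look as follows, using the illustrative host names from above:

```html
<link rel="stylesheet" href="http://css.yourdomainname.nl/style.css">
<img src="http://images1.yourdomainname.nl/logo.png" alt="Company logo">
<img src="http://images2.yourdomainname.nl/photo123.jpg" alt="Uploaded photo">
```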