Eskimo North Web Site Development Information and Guidelines
Every user account, whether it be a remote shell account or a dial-up subscription, comes with the ability to host a web site. Please do not use your site for any illegal activities. Adult material is permitted; however, we have a separate server name for adult material, and it must be confined to that server.
In your home directory there are two places you can put web files:
public_html This directory maps to www.eskimo.com/~login.
Only web pages suitable for all ages may be placed in this directory.
adult_html This directory maps to adult.eskimo.com/~login.
Adult oriented web pages should be placed in this directory. If you intend to host an adult oriented virtual domain here, please let us know and it will be mapped to adult_html space. Please include an entry page that restricts access to adults only.
We do not wish to restrict your freedom of expression. We only wish to allow those who consider adult material inappropriate or offensive to avoid encountering it. There is no difference in the performance of web pages served out of adult_html from those served out of public_html, they are both served from the same platform with the same bandwidth and machine resource availability.
If you have a virtual domain, which allows your web files and ftp files to exist under your own domain name, the root of the virtual domain would normally be mapped to public_html or adult_html in your directory. However, if you have multiple domains, or wish to keep your virtual domain separate from your personal web site, it is possible for us to map your virtual domain to a sub-directory of one of those directories.
You can create your web pages on-line using any of the available Unix editors and various image manipulation tools, or you can create your files off-line and then upload them to our site. Either way, your main web page must be named index.html, index.htm, or index.shtml (if using Server Side Includes, described below) and placed in a directory called public_html if it is for a general audience or adult_html if it is material intended for an adult audience.
To ftp your web files created off-line to our site, ftp to ftp.eskimo.com. Be sure to use ftp.eskimo.com as the ftp host, not eskimo.com, or you will be unable to connect. Another thing that will prevent you from connecting to our ftp server is incorrect DNS for your originating IP address: your IP address must have forward and inverse DNS, and they must agree with each other. Log in to ftp with your login and password rather than anonymous. This will place you in your home directory here.
If the public_html or adult_html directory does not already exist, you will need to create it before you can upload or create web pages on-line. You can do this by typing:
mkdir public_html -- or -- mkdir adult_html
The mkdir command in ftp will create a directory with read, write, and search permissions for the owner, and read and search (but not write) permissions for everyone else. There is no need to change the permissions when you create the directory in ftp. If you create the directory from the Unix shell, the permissions the directory is created with will be influenced by your umask value and may not be initially correct. If you created the directory from the Unix shell, set the permissions with the following command:
chmod 755 public_html -- or -- chmod 755 adult_html
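The effect of umask described above is easy to see for yourself. The following is a scratch-directory sketch (the mktemp directory is just for safe experimentation; on the real system you would work in your home directory):

```shell
# With a restrictive umask, mkdir creates a directory that is too
# strict for the web server; an explicit chmod fixes it regardless
# of what your umask happens to be.
cd "$(mktemp -d)"        # scratch directory for this demonstration
umask 077                # a restrictive umask, for illustration
mkdir public_html        # created as mode 700 under this umask
chmod 755 public_html    # now correct, independent of umask
stat -c '%a' public_html # prints 755
```

The same principle applies to files: an explicit chmod after creation always wins over whatever mode the umask produced.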
Now you are ready to create or upload your web files. Move to the directory you just created, using the following command:
cd public_html -- or -- cd adult_html
If you are uploading your web files, you can now use the ftp put command to send each file:
put index.html
put otherfiles.html
The ftp put command will create the files with read and write permissions for the owner and read permissions for everyone else. These are the permissions you want so it is not necessary to change the permissions if you uploaded your web files with ftp.
If you are creating your files on-line you can use the text editor of your choice to create the files now. The permissions on the files will be affected by your umask value so they may not be initially correct. To set the file permissions correctly use the following command for each file that you create:
chmod 644 index.html
chmod 644 otherfiles.html
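If a whole directory tree already exists with incorrect permissions, you can fix everything in one pass rather than one chmod per file. A scratch-directory sketch (substitute your real public_html or adult_html, run from your home directory):

```shell
# Set 755 on every directory and 644 on every regular file in one sweep.
cd "$(mktemp -d)"                # scratch copy so the demo is safe to try
mkdir -p public_html/images      # example layout with a subdirectory
touch public_html/index.html public_html/images/logo.gif
find public_html -type d -exec chmod 755 {} +
find public_html -type f -exec chmod 644 {} +
stat -c '%a' public_html/index.html   # prints 644
```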
Under your public_html or adult_html directories you can create any number of sub-directories and web pages subject only to the limits of your disk quota. You can use sub-directories to organize your web files. For example, you may wish to create a sub-directory called images and place your images there. You might want to create additional subdirectories under images to further organize the image files. The directory separator used by Unix is the forward slash, '/' rather than the backslash '\' used by Windows.
CGI on Your Web Site
You can use CGI programs on your web site. To do so you will need to create a cgi-bin directory under your public_html directory and you will need to make it mode 711:
cd ~/public_html
mkdir cgi-bin
chmod 711 cgi-bin
Then place your CGI programs in that directory with an extension of ".cgi". The extension needs to be ".cgi" regardless of what type of CGI program it is: perl, perl5, python, shell scripts, and compiled programs of any sort all need to have an extension of ".cgi". The mode of your CGI programs should be 500. Set the mode with the following command:
chmod 500 program.cgi
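As an illustration, here is a complete minimal shell-script CGI (the filename hello.cgi is just an example). The Content-type header and the blank line after it are required; everything after the blank line is the document body. The mktemp scratch directory is only for trying it out by hand; on the real server the script goes in your cgi-bin directory:

```shell
# Create, install, and hand-test a minimal shell-script CGI.
cd "$(mktemp -d)"                    # scratch directory for the demo
cat > hello.cgi <<'EOF'
#!/bin/sh
echo "Content-type: text/html"
echo ""
echo "<html><body><p>Hello from hello.cgi</p></body></html>"
EOF
chmod 500 hello.cgi                  # mode required by the server
./hello.cgi                          # run it by hand to check the output
```

Running the script from the shell like this is a good way to catch syntax errors and a missing header before the web server ever touches it.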
Perl is the most frequently used CGI programming language. To use a perl CGI, the top of your script has to properly specify the perl interpreter. The first line of a perl4 CGI script should be:

#!/usr/bin/perl

The first line of a perl5 script should be:

#!/usr/bin/perl5

(These interpreter paths are typical; run "which perl" and "which perl5" from the shell to confirm the actual locations on our system.)
The web servers here are Sparc based systems running SparcLinux. You will need access to a SparcLinux platform, or a system with a cross-compiler in order to compile C CGI programs for use here or you will need to ask us to compile them for you. Our shell servers are running SunOS 4.1.4 and so programs compiled on them will not run on the web servers. There is a SunOS compatibility capability in SparcLinux but it will not work with more than 256 descriptors and unfortunately, the web server requires a higher number.
If you have a virtual domain, your CGI scripts would be called as follows (the root of your domain maps to your public_html directory, so your cgi-bin directory appears at the top level; "yourdomain.com" here stands for your own domain name):

http://www.yourdomain.com/cgi-bin/program.cgi
Programs in the system cgi-bin directory are called this way ("program" here stands for the name of the system-provided program):

http://www.eskimo.com/cgi-bin/program
Presently, you cannot have a cgi-bin directory under adult_html, because the wrapper only allows it to exist under one directory. So if you have an adult site, it will be necessary to put your CGIs under your public_html directory and call them from your adult web pages. If you have permissions set correctly, nobody will be able to see them from the main web server, so this should not be a problem.
CGI programs execute with your user ID and your group ID. Evil people can assume your identity and do evil things if you let them.
Poorly designed programs can result in damage to your files not limited to your web site. When you put a CGI program on your web site you are allowing people all over the Internet to execute programs with your permissions.
Be very sure that your CGI programs do not allow a remote user to execute arbitrary commands or read or write arbitrary filenames. Programs should be very careful to eliminate any "../" back references, wild card or regular expression references to filenames or commands, references to files or commands starting at "/", and potential buffer overflow conditions which can allow someone to cause arbitrary commands to be executed with your permissions.
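One common defensive pattern is to whitelist rather than blacklist: reject any input that is not a plain filename before touching the filesystem. The sketch below assumes (hypothetically) a script called show.cgi that receives the requested filename in QUERY_STRING; the details of how your own script receives input will differ:

```shell
#!/bin/sh
# show.cgi (hypothetical) -- returns a file from the current directory,
# but only if the requested name is a plain filename: it rejects an
# empty name, anything containing "..", anything containing "/", and
# any character outside letters, digits, dot, underscore, and hyphen.
name="$QUERY_STRING"
case "$name" in
  ""|*..*|*/*|*[!A-Za-z0-9._-]*)
    echo "Content-type: text/plain"; echo ""
    echo "Rejected: unsafe filename."
    ;;
  *)
    echo "Content-type: text/plain"; echo ""
    cat -- "$name" 2>/dev/null || echo "No such file."
    ;;
esac
```

The whitelist approach is safer than trying to strip out known-bad sequences, because a stripping pass can often be defeated by nesting (for example, "..../" becoming "../" after one round of removal).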
PHP on Your Web Site
You can use PHP programs on your web site. PHP programs must reside within your public_html directory and have an extension of ".php". The mode on your PHP programs should be 400. Set the mode with the following command:
chmod 400 program.php

No special consideration is required when calling PHP programs from within a virtual domain. The file mapping of PHP files is the same as the mapping of regular HTML files within your virtual domain.
For information on programming in PHP, please see the PHP online manual at www.php.net.
PHP programs execute with your user ID and your group ID. Evil people can assume your identity and do evil things if you let them.
Poorly designed PHP programs can result in damage to your files not limited to your web site. When you put a PHP program on your website you are potentially allowing users all over the Internet to do anything you can do from a shell account with your permissions including the execution of arbitrary commands and reading, writing, or deleting files. The dangers of PHP code are very much in parallel with those of CGI scripts. Owing to the greater flexibility, complexity, and speed of PHP, it's much easier to write programs with unintended capabilities, and abusive individuals can abuse faster and more efficiently than ever before. Therefore great care must be taken when writing PHP code.
As with CGI programs, when writing PHP code, it is important to avoid allowing a remote user to execute arbitrary commands or read or write arbitrary files. Programs should be very careful to eliminate any "../" back references, wild card or regular expression references to filenames or commands, and references to files or commands starting at "/" which can allow someone to cause arbitrary commands to be executed with your permissions.
PHP automatically allocates variables and grows them as necessary, so unlike many CGI languages, buffer overflow exploits are generally not a concern within the PHP program itself. However, if PHP calls or passes data to any external programs, the opportunity for buffer overflow exists there.
With PHP, another concern is the possible use of uninitialized variables. Even though PHP allows this, it is good practice to ALWAYS initialize variables. Otherwise it is possible for a variable to be initialized externally and have a value that your code did not expect.
Presently, secure browsing is available only on the "commerce.eskimo.com" server, due to the way domain certificates are obtained. With it, you can use a CGI script in your directory to send secure data, such as preparing credit card transactions (we're currently working on that aspect; until we have it up and running, you'll need to use another server for the actual processing of such card data). Use the following form of URL for secure browsing:
https://commerce.eskimo.com/~username/filename.html -- or -- https://commerce.eskimo.com/~username/cgi-bin/program.cgi
Electronic Commerce Solutions
Selling is tough. The last thing you want to do is to make it difficult for your customers to purchase your products and services. E-Commerce Exchange Solutions make it easy by allowing you to accept payment via major credit cards or electronic checks. The ability to schedule automatic recurring payments is also provided.
If you are designing online stores or point-of-sale systems for customers, you can also earn a commission for referring customers by joining the E-Commerce Exchange Affiliate Program.
Server Side Includes
Server side includes are allowed, with the exception of Exec, which is disallowed. Although it is slightly more work (you have to spit out a header), just about anything you could do with Exec can be done with a CGI script.
Normally, ".html" files are not parsed for SSI directives. There are two ways to tell the server to parse a file for SSI directives. If you give the file an extension of ".shtml" the server will parse the file for SSI directives. Also, if you set the user execute bit the file will be parsed for SSI directives. To do this type:
chmod u+x file.html
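For illustration, a page parsed for SSI might use the include and echo directives like this (the header.html file named here is just an assumed example; substitute your own login and file):

```html
<!--#include virtual="/~login/header.html" -->
<p>This page was last modified on
<!--#echo var="LAST_MODIFIED" -->.</p>
```

The include directive pulls a shared fragment (such as a common header) into every page that references it, and echo inserts a server variable into the output.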
If you set the group execute bit, the content of the file may be cached by proxy servers. Do this for STATIC content (pages kept in a file). Do not do this for dynamic content, such as a CGI script which produces different output each time it is run, and do not do it in situations where you need to disallow caching, such as a counter or a banner advertisement where a different advertisement is displayed for each hit.
Document Cache Control
Many ISPs use caching proxy servers that intercept customer requests for web pages and serve them out of cache if allowed and if they have the requested page in cache. This provides a response to the end user faster but risks providing a stale copy of a document that has since been updated.
We allow web hosting customers to determine whether or not they wish to allow a given web page to be cached. If the group execute bit is not set on a web page, the web server will not generate a last modified date header and remote caching servers will not be allowed to cache your web page.
If the group execute bit is set, a last modified date header will be generated and remote caching servers can cache your page and will use that header to determine when something should be expired from cache and a fresh copy fetched.
You should allow caching on web pages or images that never change or change infrequently. You may want to avoid allowing caching on pages where it is critical that the end user get the most recent version, because a caching server may serve an old document if it hasn't yet expired from cache.

You should also avoid allowing a file to be cached if that file generates dynamic content, even though the file itself hasn't been updated. Otherwise, users sitting behind a caching server will only see a static document that never changes.
To allow document caching: chmod g+x file.html
To disallow document caching: chmod g-x file.html
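You can see the effect of the cache-control bit on the file mode directly. A scratch-directory sketch:

```shell
# Toggle the cache-control (group execute) bit and inspect the mode.
cd "$(mktemp -d)"
touch page.html
chmod 644 page.html        # standard page permissions
chmod g+x page.html        # allow proxy caching: mode becomes 654
stat -c '%a' page.html     # prints 654
chmod g-x page.html        # disallow caching again: back to 644
stat -c '%a' page.html     # prints 644
```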
Password Protecting a Portion of Your Web Site
Another reason that you may wish to use sub-directories is so that you can password protect a portion of your web site. You may wish to do this in order to provide information only to internal members of an organization or to members of a subscription service. The standard htaccess method of password protection is supported. To use this method you need to create two files.
The following assumes that you are logged into the Unix shell. This is necessary because some of the required commands, like "encrypt", are not available from ftp. There may be a Windows program that can generate encrypted password data, but I do not know of any offhand.
One file called ".htpasswd" contains a list of users and encrypted passwords. This file should be placed in your home directory tree but it should NOT be placed under either your public_html or adult_html directory. The reason for this is that this file must have public read permission in order for the web server to be able to access it but you do not want it available for download via the web because people can run dictionary attack programs in an attempt to discover the passwords that are encrypted within. The format of this file is one entry per line, with the login and the encrypted password separated by a colon, for example:

alice:kwzwe.fMQXsCU
To create the encrypted passwords, use the encrypt command from the Unix shell, giving it the password you wish to encrypt:

encrypt password1
You get a response that looks like this:
"password1" = "kwzwe.fMQXsCU"
Once you have that you just need to paste the password into the file or copy it by hand if you do not have cut-and-paste capability.
Another method of creating the ".htpasswd" file is with the shell command "htpasswd":
Usage: htpasswd [-c] passwordfile username
The -c flag creates a new file.
Enter the password at both prompts.
htpasswd -c .htpasswd alice
htpasswd .htpasswd bob
. . .
Once you have created the ".htpasswd" file you should make sure the permissions are set correctly by using the following command:
chmod 644 .htpasswd
The second file required to password protect a portion of your web site is a file called ".htaccess". This file must be placed in the directory to be protected and protects the contents of that directory and any subdirectories nested within that directory. The contents of this file tell the web server where to find the ".htpasswd" file and what form of authentication to apply. In this example I am showing you how to use password authentication to protect a portion of your web space. It is also possible to limit access using groups, or by domain name or address space. The format of the ".htaccess" file is:

AuthType Basic
AuthName "Members Only"
AuthUserFile /home/login/.htpasswd

<Limit GET PUT POST>
require user alice
</Limit>

(The AuthName string is a label of your choosing; browsers display it in the password prompt.)
This limits the access to this directory to only 'alice'; although 'bob' and 'charlie' have passwords, too, they can't access this directory. To allow anyone with a password to gain entrance you can use:
AuthType Basic
AuthName "Members Only"
AuthUserFile /home/login/.htpasswd

<Limit GET PUT POST>
require valid-user
</Limit>
Substitute the path of your home directory for "/home/login". Case is important in the above file: the "GET PUT POST" must be UPPER case, and "Limit" must have an upper-case "L" followed by lower-case "imit". Placing all three limiters (GET, PUT, and POST) there ensures that all access to that directory will be protected, rather than protecting one request method while leaving the others open. The keywords in the above file should be flush against the left margin, with no spaces on the line before the keywords.
Set the permissions so that the .htaccess file is publically readable:
chmod 644 .htaccess
Now when you attempt to access the URL corresponding to the directory the ".htaccess" file is located in, your browser will prompt you for a username and password. If you supply the correct password you will be permitted to access the material requested; otherwise access will be denied.
Blocking Access by IP
There are situations where you may want to block (or allow) access based on the hostname or IP address of the browsing user rather than use a password method. For instance, if someone out there is flooding your logs with all sorts of errors trying to find pages without using your links, you can block access from their site without blocking everyone else. Or, if you want your workstation at work to be able to access a directory but no one else, you can set the server to allow it from your machine and deny it for everyone else. The trick lies in the same "<Limit GET PUT POST>" section, but with different limiting rules.
To deny access to a set of addresses (sample address of "192.168.12.34"), while allowing access from everywhere else, the pattern is:
<Limit GET PUT POST>
order allow,deny
allow from all
deny from 192.168.12.34
</Limit>
Similarly, to allow connections from a limited set of addresses (again with our sample of "192.168.12.34"); note that the "order" changes as well:
<Limit GET PUT POST>
order deny,allow
deny from all
allow from 192.168.12.34
</Limit>
These limits can also be specified with hostnames or domain patterns. For instance, a directory specifically for Boeing employees might use:
<Limit GET PUT POST>
order deny,allow
deny from all
allow from .boeing.com
</Limit>
Alternate Error Pages
Another line you can have in your ".htaccess" file will replace our default "404 - Not Found" page or the empty "401 - Authentication Failed" screen. This can be done to keep a similar look and format across an entire set of pages. The format of this line is shown here to replace both error pages (remember to replace "login" with your own login name):
ErrorDocument 401 /~login/401.html
ErrorDocument 404 /~login/404.html
Only 400-series errors can be redirected in this manner. Substitute your own login name in these lines, as well as the name(s) of the file(s) you've created for each error page. Remember that the error pages themselves need proper permissions (644) as well. This will change the referenced error outputs for the directory containing the ".htaccess" file and for all of its subdirectories.
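Creating a custom error page is just creating an ordinary HTML file with the right permissions. A scratch-directory sketch (the page content is only an illustration; in practice, create the file in your public_html directory):

```shell
# Create a custom 404 page and give it the required permissions.
cd "$(mktemp -d)"
cat > 404.html <<'EOF'
<html><body><h1>Page not found</h1>
<p>Sorry, that page doesn't exist on this site.</p></body></html>
EOF
chmod 644 404.html
stat -c '%a' 404.html     # prints 644
```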
Hosting a Domain
Now that you have a website up on our server, you might want to have your own domain name ("http://www.something-of-your-own.com/") rather than listing it as a user site on our main server's name ("http://www.eskimo.com/~yourname"). For information on how to do this (registering a domain, sending in needed payments to us for domain hosting, etc.), please see our page about "Virtual Domain Hosting".
Search Engine Optimization
The single most important factor in getting a high search engine ranking is content. Search engines are highly optimized towards finding content relevant to the user's search terms. Any attempt to get your page to rank high for keywords that are irrelevant to the content won't be effective. Try to structure your site for a human reader. Incomplete and poorly constructed sentences will count against your site's ranking.
Many search engines include the number of links from external sites to your site in your site's ranking. If many sites link to your site this indicates that your site has interesting content and therefore deserves a higher ranking. Some search engines also factor the number of links from your site to external sites into their rankings. Google tends to weight links more strongly in overall ranking than other search engines and was one of the first, if not the first, to use links as ranking criteria, hence this has become known as the "Google Factor".
The Google factor has provided incentives for companies to spring up that do nothing but register a huge number of domains, each of which is nothing more than a page of links to sites, and then include your site in their links for a fee. Search engines have gotten wise to these sites, and they likely will not have a positive impact upon your site ranking today. Further, Google and other search engines are starting to associate the content of the linking site with the linked-to site, and if there is not a close correspondence between them, it will not help your ranking. The long and short of it is: don't encourage irrelevant links, and don't link to irrelevant material.
Search engines will often rank text that is within headline tags, highlighted, or in a larger font more heavily than regular text or small fonts. If you want to make something stand out to a search engine, make it stand out for your human visitors. The search engines try to act like a human so they can evaluate site content in a human context, and are getting more and more successful in this respect. Putting keywords in a size 1 font with the same color as the background is not likely to help your site's ranking.
Some search engines ignore Meta tags, others rank ONLY Meta tag data, and still others correlate the Meta tag data with the page content. The best thing to do is to use Meta tags and make sure that any terms in those tags also appear in the body text.
Another place to put keywords is in the "ALT" tags of image references. The ALT tag displays text in place of images for people who have images disabled or who use a text-only browser such as lynx. Also, some browsers, like Mozilla, will bring up a little window with the content when you mouse over an image link with an ALT tag. Some search engines will index these tags, so they represent another opportunity to make relevant keywords visible to search engines.
If you have your own domain, be sure to create a robots.txt file in the root directory so that search engines understand they have permission to index your files. Also be sure to specify a character set. Some search engines will not index content with an undefined character set.
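A minimal robots.txt that grants all well-behaved robots permission to index the entire site looks like this (an empty Disallow line means nothing is off limits):

```
User-agent: *
Disallow:
```

Place the file at the top level of your domain's web space so it is served as /robots.txt. The character set can be declared in each page's head with a meta tag, for example: <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">.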
A couple of additional important items are how fast your page comes up and technical correctness of your web page. We've done everything humanly possible to make your webpage come up fast. But ultimately the size of the page is still a factor in how long it takes for a search engine to retrieve it. As a general rule I would suggest trying to limit the size of individual pages to around 100 kbytes. This will give a load time of around 20 seconds on a 56k modem. Most search engines will not index past this point anyway. So if you have too much content to fit on one page it's better to break it up into several linked smaller pages.
I'm only giving a cursory treatment of Search Engine Optimization here. For in-depth information, as well as an excellent site-analysis tool which we use, I would highly recommend the services of Scrub The Web. They're inexpensive, provide much more extensive information regarding web site promotion and search engine optimization, and have tools that find, and allow you to fix, technical problems with your site and optimize it.
Additional References

A Beginner's Guide to HTML: W3Schools HTML Tutorial
CGI Programming: CGI Programming FAQ
HTML quick reference: W3Schools HTML 4.01 / XHTML 1.0 Reference