Diagnosing Server Slowdowns

Posted: December 28th, 2010
Filed under: Web Speed and Performance


If your web server is experiencing performance slowdowns, and in particular if the average load time of your pages seems to be increasing, a good first place to look for the cause is your IIS logs. Read the rest of this entry »
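As a quick example, a few lines of script can rank URLs by average response time straight from a W3C-format IIS log. This is only a sketch: the log path below is hypothetical, and it assumes your logging configuration includes the cs-uri-stem and time-taken (milliseconds) fields.

    # Rank URLs in a W3C-format IIS log by average time-taken.
    from collections import defaultdict

    LOG_PATH = r"C:\inetpub\logs\LogFiles\W3SVC1\u_ex101228.log"  # adjust for your server

    fields = []
    totals = defaultdict(lambda: [0, 0])  # url -> [total ms, hit count]

    with open(LOG_PATH) as log:
        for line in log:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]  # column names for the data rows that follow
                continue
            if line.startswith("#") or not line.strip():
                continue  # skip other comment lines and blanks
            row = dict(zip(fields, line.split()))
            url, taken = row.get("cs-uri-stem"), row.get("time-taken")
            if url and taken:
                totals[url][0] += int(taken)
                totals[url][1] += 1

    # Print the ten URLs with the highest average response time.
    averages = {url: ms / hits for url, (ms, hits) in totals.items()}
    for url in sorted(averages, key=averages.get, reverse=True)[:10]:
        print(f"{averages[url]:8.1f} ms  {url}")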

No Comments »

List of Free Site Performance Tools

Posted: May 3rd, 2010
Filed under: IIS & HTTP, Web Speed and Performance


If you are a site owner, webmaster or a web author, here are some free tools that you can use to evaluate and improve the speed of your site:

Firefox / Firebug Add-ons:

  • YSlow, a free tool from Yahoo! that suggests ways to improve website speed.
  • Page Speed, a free tool from Google that evaluates the performance of web pages and gives suggestions for improvement.
  • Hammerhead, a free tool that measures the load time of web pages.

Development Tools:

  • CSS Sprite Generator – Generates a CSS sprite out of a number of images.
  • Smush It – Online tool that allows you to upload images for lossless compression and optimization. Provides a report of bytes saved and downloads a zip file containing the optimized versions of the files.

Additional Resources:

  • In Google Webmaster Tools, Labs > Site Performance shows the speed of your website as experienced by users around the world.

Looking for more free tools? Check out Google’s list.

No Comments »

Bigger Isn’t Always Better

Posted: May 3rd, 2010
Filed under: IIS & HTTP, Web Speed and Performance


Consider your image sizes for better site performance

Graphics and multimedia can take up more than 60% of the total file size of an average web page, so graphics optimization matters. Not only should the size and file type of your images be taken into consideration, but also how fresh they are.

Image Caching

Minimizing round trips over the Web to revalidate cached items can make a huge difference in browser page load times. Perhaps the most dramatic illustration of this occurs when a user returns to a site for the second time, after an initial browser session. In this case, all page objects will have to be revalidated, each costing valuable fractions of a second and blocking actual new content from using the browser’s available network connections (not to mention consuming bandwidth and server cycles). Read the rest of this entry »
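You can watch that revalidation cost yourself with a rough sketch like the one below: the second request sends If-Modified-Since, and an unchanged resource comes back as a bodiless 304. The URL is a placeholder; point it at any image on your own site (and note that urllib surfaces the 304 as an HTTPError).

    import urllib.request

    URL = "http://www.example.com/images/logo.gif"  # placeholder URL

    first = urllib.request.urlopen(URL)
    body = first.read()
    last_modified = first.headers.get("Last-Modified")
    print("First request: ", first.status, "-", len(body), "bytes transferred")

    if last_modified:  # assumes the server sent a Last-Modified validator
        req = urllib.request.Request(URL, headers={"If-Modified-Since": last_modified})
        try:
            second = urllib.request.urlopen(req)
            print("Second request:", second.status)
        except urllib.error.HTTPError as err:
            # urllib raises on 304, but that "error" is the cheap revalidation
            # response: no body, just a round trip to confirm freshness.
            print("Second request:", err.code, "Not Modified - no body transferred")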

No Comments »

Google Steps up the Pace

Posted: May 3rd, 2010
Filed under: IIS & HTTP, Web Speed and Performance


Google Adds Speed Algorithm, Affects Site Rankings in Search

For a long time now, Google has pushed the importance of speed in its products and on the web. Just last month, Google announced that it will be including a new signal in its search ranking algorithms: site speed.

“Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we’ve seen in our internal studies that when a site responds slowly, visitors spend less time there. But faster sites don’t just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that’s why we’ve decided to take site speed into account in our search rankings. We use a variety of sources to determine the speed of a site relative to other sites.” Read the rest of this entry »

No Comments »

Turbo Charging Your Web Server

Posted: March 29th, 2010
Filed under: IIS & HTTP, Web Speed and Performance


A good place to turn in order to speed up your site is the server software and hardware itself.

Starting with software, it is fairly unlikely that Web administrators are going to quickly dump IIS for Apache or Apache for IIS, even when security, ease of use, or performance is cited as a reason to leave one camp for another. Put simply, the Web server and its associated operating system are often so intertwined with the Web site(s) that migration becomes an onerous and even risky task. Read the rest of this entry »

No Comments »

5 Markup Optimization Tips

Posted: March 17th, 2010
Filed under: Web Speed and Performance


Typical markup is either very tight, hand-crafted and standards-focused, filled with comments and formatting white space, or it is bulky, editor-generated markup with excessive indenting, editor-specific comments often used as control structures, and even redundant or needless markup or code. Neither case is optimal for delivery. The following tips are safe and easy ways to decrease file size: Read the rest of this entry »
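In that spirit, here is a minimal illustration of two such reductions: stripping comments and collapsing formatting white space. It is deliberately naive (real pages need care around pre blocks, inline scripts, and IE conditional comments), so treat it as a sketch rather than a production minifier.

    import re

    def shrink_markup(html):
        html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # drop comments
        html = re.sub(r">\s+<", "><", html)     # white space between tags
        html = re.sub(r"[ \t]{2,}", " ", html)  # runs of spaces and tabs
        return html.strip()

    page = """
    <!-- header begins -->
    <div>
        <p>Hello,    world.</p>
    </div>
    <!-- header ends -->
    """
    print(shrink_markup(page))  # -> <div><p>Hello, world.</p></div>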

No Comments »

Pouring Cache Flavored Candy

Posted: March 16th, 2010
Filed under: IIS & HTTP, Web Speed and Performance


Visualizing how caching works can be difficult, so here is an illustration that may help.

Cache Control: The Hourglass Illustration

For this illustration, visualize the path from your server to your client's computer via the internet as a large hourglass. Data needs to be sent back and forth through the hourglass constantly. Pouring 1 lb. of M&Ms through it, back and forth, takes a long time. But assume that the brown M&Ms are unchanging, while the red, green, and other colors may change flavor. If you can leave all the brown ones on the browser side (drain them from the hourglass), then the remaining ½ lb. of other colors can be sent back and forth much more quickly. Read the rest of this entry »

No Comments »

HTTP Compression: Compressing Files, How Low Can You Go?

Posted: February 23rd, 2010
Filed under: IIS & HTTP, Web Speed and Performance


HTTP compression on IIS is easy to enable with tools such as httpZip or ZipEnable and requires no client-side configuration to obtain benefits, making it a very smart way to get extra performance and a better user experience.

It's well known that there is a limited amount of bandwidth on most Internet connections, and anything IT administrators can do to accelerate site load time benefits not only the organization but its users as well. HTTP compression, a function built into both browsers and servers, can substantially improve site performance by reducing the amount of time required to transfer data between the server and the client. Encoding data in a compressed format like GZip or Deflate does add complexity to the HTTP request/response interaction by necessitating a type of content negotiation: the server determines whether the browser can handle compressed data and sends it the appropriate version of the resource.
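You can watch that negotiation from the client side with a few lines of Python: advertise gzip support in Accept-Encoding and see whether the server answers with a Content-Encoding header. Any public site can stand in for the placeholder host here.

    import http.client

    conn = http.client.HTTPConnection("www.example.com")  # placeholder host
    conn.request("GET", "/", headers={"Accept-Encoding": "gzip, deflate"})
    resp = conn.getresponse()
    print("Status:          ", resp.status)
    print("Content-Encoding:", resp.getheader("Content-Encoding", "(none - sent uncompressed)"))
    conn.close()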

Read the rest of this entry »

No Comments »

Tally Ho and Onward to 2010

Posted: December 15th, 2009
Filed under: IIS & HTTP, Web and Application Security, Web Speed and Performance


2009 has proven to be a busy year of product development for Port80 Software, and we don't see 2010 being any less productive. We have launched major upgrades and improvements across our tools, such as adding IIS 7/7.5 support to both CacheRight, our popular caching program, and LinkDeny, our easy-to-use anti-hotlinking tool. We've also seen point upgrades filled with new features and usability improvements to httpZip, ServerMask, ZipEnable, and ServerDefenderAI.

The future looks bright at Port80, with a major httpZip update adding IIS 7/7.5 and Windows 2008 support due in early 2010. ServerDefenderVP, our powerful new Web Application Firewall, is in its final shakedown as you read this. By the time you are back to work, it might already be out, or will be very shortly after, in time for your New Year's resolution to really lock down your Web site or application.

We look forward to continuing to provide our customers with the professional tools they have come to expect and the compatibility they can rely on. We wish all of our clients a happy holiday and a very productive and profitable New Year.

Thank you for your continued patronage.

From all of us at Port80

No Comments »

What is Cache Control?

Posted: November 16th, 2009
Filed under: Web Speed and Performance


What is Cache Control?

Cache control should not be confused with caching itself, which is generally a special high-speed storage mechanism: either a reserved section of main memory or an independent high-speed storage device. Cache control is also different from caching in ASP.NET or PHP Web development, where the focus is on pre-generating database queries so that dynamic pages load faster in browsers. Read the rest of this entry »

No Comments »

Spend Your Cache Wisely

Posted: November 16th, 2009
Filed under: Web Speed and Performance


Spend Your Cache Wisely
Understanding Caching and Cache Control

Cache, not to be confused with "cash", isn't something to be spent down at the local market, or, for that matter, every time someone loads your page. Effective cache control involves managing the freshness and frequency of your page loads. The basic idea behind caching is simple: instead of wasting effort by re-downloading a resource every time it is needed, keep a local copy and reuse it for as long as it is still valid. Read the rest of this entry »
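Here is that idea in miniature: a toy sketch of a freshness-based cache with a max-age-style time-to-live, which is essentially what a browser does with every response it is allowed to keep. The URLs and the fetch callable are stand-ins.

    import time

    class MiniCache:
        def __init__(self):
            self._store = {}  # url -> (expires_at, body)

        def get(self, url, fetch, max_age=60):
            now = time.time()
            entry = self._store.get(url)
            if entry and entry[0] > now:   # still fresh: no round trip needed
                return entry[1]
            body = fetch(url)              # stale or missing: re-download
            self._store[url] = (now + max_age, body)
            return body

    cache = MiniCache()
    fetches = []
    fake_fetch = lambda url: fetches.append(url) or f"<contents of {url}>"
    cache.get("/logo.gif", fake_fetch)  # downloads
    cache.get("/logo.gif", fake_fetch)  # served from cache, no download
    print(len(fetches), "network fetch(es) for 2 requests")  # -> 1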

No Comments »

HTTP Compression : Smaller, Faster… Better

Posted: August 17th, 2009
Filed under: IIS & HTTP, Web Speed and Performance


What is HTTP compression and how does it work?

HTTP compression is a long-established Web standard in which a GZip or Deflate encoding method is applied to the payload of an HTTP response, significantly compressing the resource before it is transported across the Web.

When data is encoded using a compressed format like GZip or Deflate, it introduces complexity into the HTTP request/response interaction by necessitating a type of content negotiation. Whenever compression is applied, it creates two versions of a resource: Read the rest of this entry »
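Sketched from the server's side, the choice between those two versions looks roughly like this: one resource, two representations, selected per request from the Accept-Encoding header. This is illustrative only; real servers also handle Deflate and q-values, and the Vary header tells caches to keep both versions.

    import gzip

    RESOURCE = b"<html><body>Hello, compressed world!</body></html>" * 20

    def respond(accept_encoding):
        headers = {"Vary": "Accept-Encoding"}
        if "gzip" in accept_encoding.lower():
            headers["Content-Encoding"] = "gzip"
            return headers, gzip.compress(RESOURCE)
        return headers, RESOURCE  # identity version for clients that can't inflate

    for ae in ("gzip, deflate", ""):
        headers, body = respond(ae)
        print(f"Accept-Encoding={ae!r:18} -> {len(body):5} bytes, "
              f"encoding={headers.get('Content-Encoding', 'identity')}")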

No Comments »

Tap… Tap… Is This Thing On?

Posted: August 17th, 2009
Filed under: IIS & HTTP, Web Speed and Performance


How can I tell when compression is working?

Our free online Compress Check tool is a good place to start. However, one of the best ways to answer this question is to look at the conversation between a browser and a server at the HTTP level. Since browsers generally do not display protocol-level data, you will need a tool that lets you see what the browser usually keeps hidden.

Fortunately, there are a number of options: Read the rest of this entry »
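If you'd rather script the check than install a tool, this rough sketch fetches the same page twice, once refusing compression and once accepting gzip, and compares the bytes on the wire. Substitute your own URL; urllib sends exactly the Accept-Encoding header we specify and does not decompress the response for us.

    import gzip
    import urllib.request

    URL = "http://www.example.com/"  # placeholder URL

    def fetch(accept_encoding):
        req = urllib.request.Request(URL, headers={"Accept-Encoding": accept_encoding})
        with urllib.request.urlopen(req) as resp:
            return resp.headers.get("Content-Encoding", ""), resp.read()

    _, plain = fetch("identity")
    encoding, body = fetch("gzip")

    print(f"Uncompressed: {len(plain)} bytes")
    if encoding == "gzip":
        inflated = len(gzip.decompress(body))
        print(f"Compressed:   {len(body)} bytes on the wire ({inflated} inflated)")
    else:
        print("Server did not compress the response.")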

No Comments »

Online Screencasts Are Here

Posted: January 8th, 2009
Filed under: Web and Application Security, Web Speed and Performance


We know, you’ve all been waiting for this moment… We here at Port80 have pulled together some overview Screencasts of our products, including the most recent release of ServerMask 4.1.1, not yet a month old!

Check them out at www.port80software.com/screencasts

Cheers!

Jenny @ Port80

No Comments »

Longest. Download. Ever.

Posted: May 21st, 2008
Filed under: IIS & HTTP, Web Speed and Performance


Given latency and constrained connections out there, if you ever receive this download warning, make sure you have lots of food and water stored up for the wait:

http://www.w3schools.com/downloadwww.htm

There are some files even HTTP compression and caching cannot help!

Cheers,
Port80

1 Comment »

Webinar on Web Performance Solutions (Microsoft IIS and More — Archived Version)

Posted: May 21st, 2008
Filed under: Web Speed and Performance


Learn how to send less data, less often in this new archived webinar from Port80 Software!

Port80 Software invites you to view a webinar on Web performance solutions… We recently reviewed the Web acceleration market, how Port80 Software solutions like httpZip, ZipEnable, CacheRight, and w3compiler come into play for IIS Web server performance, and how to analyze HTTP compression and Expiry Cache Control solutions with HTTP analysis tools.

Agenda — Web Performance Solutions for Microsoft IIS Web Servers:

– Common Web Performance Challenges
– The Web Acceleration Market
– Analyzing the HTTP Request/Response Cycle
– HTTP Compression
– Expiry Based Cache Control
– Questions and Answers…

Login to Access the Archived Webinar:

– Go to https://www119.livemeeting.com/cc/port80/view?id=Web_Performance_Solutions_1.
– Enter your name, the Recording ID if not already entered (Web_Performance_Solutions_1), bypass the Recording Key (this is not required), and then click View Recording.
– You will be presented with two format options on the next screen; we recommend choosing the second version ("Meeting Replay"), as the first version does not include video.
– Click the Windows Media icon under ‘View’ for the Microsoft Office Live Meeting Replay version, and this will launch the Webinar in Windows Media Player (http://www.microsoft.com/windows/windowsmedia/default.mspx).
– If you have any trouble logging into the Webinar, please just ask for help at support@port80software.com.

Thanks for watching, and please let us know if you have a topic for our next webinar event or if we can answer any questions on IIS performance!

Best regards,
Port80 Software

No Comments »

HTTP Compression and the Google AdSense Crawler Bot

Posted: February 22nd, 2008
Filed under: Web Speed and Performance


FACT: HTTP Compression really improves Web serving.

FACT: Big sites like Google and Yahoo! use compression.

UNFORTUNATE FACT: Some services are not aware enough of compression and may break… unless you have a smart compression engine!

This underutilized technology transparently reduces the size of all text-based content served from a Web site or Web service, speeding up transmission across the Web, reducing bandwidth expenses, and freeing up Web server availability to handle more requests. Compression deployments are accelerating among business sites, and Google.com has been compressing responses for a long time (see this real-time report: http://www.port80software.com/tools/compresscheck?url=www.google.com).

Google's Googlebot, the Web crawler that indexes sites to form the basis of search results, also likes to see compressed content. At a search engine conference a few years back, search guru Danny Sullivan spent some time focusing on this: Google only indexes so much of a page, so if you send Googlebot compressed content (which it asks for via the "Accept-Encoding: gzip, deflate" header in its requests), you can theoretically get more content indexed, while HTTP compression also saves bandwidth on requests from Googlebot and from IE, Firefox, and other browsers and search bots. Very cool.

It is ironic, then, given Google's knowledge and use of HTTP compression, that Google's AdSense program, which sells contextual advertising on third-party sites, uses technology that is not compatible with HTTP compression. One of Port80 Software's httpZip compression clients recently received this email from Google's AdSense team, in response to asking why the client's contextual ad site was not getting indexed by the AdSense crawler bot (which goes by a user-agent name starting with "mediapartners-google"; a user agent is the Web client's name, usually a browser or bot). This is part of the email from a Google AdSense rep to our client:

“I’ve reviewed your site and have determined that our crawler is having difficulty accessing your URL. Specifically, your webserver is sending our crawler HTML in a compressed format, which our crawler is unable to process.

We recommend that you speak with your web administrator to ensure your system does not send our crawler compressed data. You can determine our crawler by looking for user agents starting with ‘Mediapartners-Google’.

Additionally, please be aware that after you have turned off the encoding, it may be 1 or 2 weeks before the changes are reflected in our index. Until then, we may display less relevant or non-paying public service ads. You should expect your ad relevance to increase over time.”

So, the AdSense crawler bot does not like HTTP compression. But the real question is: why is it asking for it? To receive compression from any Web server, a user agent must first send that "Accept-Encoding: gzip, deflate" header in the original request… if the AdSense bot cannot deal with compression, it should not request it in the first place. That makes sense, right?

It looks like Google AdSense is asking clients not to compress responses to its bot, rather than fixing the decompression bug (an educated guess) in the bot's code. So the fix for now: if you run a Web server, serve Google AdSense ads on your own site, and still want to use compression for all other Web visitors, you must make an exception for any request with a user agent starting with "mediapartners-google".
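The logic of that exception is simple enough to sketch in a few lines. This is a generic illustration of the rule, not httpZip's actual implementation:

    EXCLUDED_UA_PREFIXES = ("mediapartners-google",)

    def should_compress(user_agent, accept_encoding):
        ua = user_agent.lower()
        if any(ua.startswith(prefix) for prefix in EXCLUDED_UA_PREFIXES):
            return False  # AdSense bot: always send the identity version
        return "gzip" in accept_encoding.lower()

    # The bot asks for gzip but is excluded; a normal browser gets compression.
    print(should_compress("Mediapartners-Google/2.1", "gzip, deflate"))  # False
    print(should_compress("Mozilla/5.0 (Windows)", "gzip, deflate"))     # True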

Unfortunately, you cannot do this on Microsoft IIS 4 or 5 servers (NT or 2000) without a third-party compression tool like httpZip from Port80 Software, which can add a compression exclusion for a user agent. On IIS 6 (Windows 2003), you can use httpZip or ZipEnable to add such an exception. We will be adding a default exception for this bot to a minor version upgrade of both products soon, but here is how to add an exception for the AdSense bot with httpZip and ZipEnable.

Excluding Google's AdSense Bot from IIS Compression with httpZip:

– Install the free httpZip trial from www.httpzip.com/try.

– Once installed, confirm compression is working fine (http://www.port80software.com/products/httpzip/evaluation).

– Open the httpZip Settings Manager.

– On the Compression tab, to add a new Browser Exception for a MIME type, click "New". In the Add Browser Exception dialog, enter a name for the browser (like "AdSense Bot") in the text box labeled "Browser Name". Next, enter the search string used to identify the browser in the text box labeled "Search String" (use "mediapartners-google" to match all versions of the bot; this short string acts as a wildcard across specific software versions), then click OK. Please note: you will have to add this exception for each MIME type being requested by the bot, which should include "text/html", "text/css", "text/javascript", and "application/x-javascript", and probably a few more, based on what you are serving and want to get indexed.

Picking a MIME (text/html) to Exclude the AdSense Bot from compression

Setting up the AdSense Bot Exception for text/html MIME

– Apply your settings in the httpZip Settings Manager. Repeat the process for other MIME types that you want indexed (FYI, text/html should take care of most dynamic content output from ASP, ASP.NET, CFM, PHP, JSP, etc. files).

– You can use WFetch, a free tool in the IIS 6.0 Resource Kit, to test that responses requested by the AdSense bot will not be compressed (http://support.microsoft.com/kb/840671). Just set the user agent to "Mediapartners-Google" and add the "Accept-Encoding: gzip, deflate" header to a request in WFetch; with the new httpZip exclusion in place, the response from the server will not be compressed (it should contain no "Content-Encoding: gzip" or "Content-Encoding: deflate" header). You can also script this check; see the sketch after this list.

– All other requests from well-behaved browsers and bots will still be compressed, and you can feel confident that you are no longer tripping up the Google AdSense bot. Remember, it may take a few weeks for the AdSense bot to reindex your site correctly.
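If you do not have WFetch handy, a rough Python equivalent of that test sends the bot's user agent plus the Accept-Encoding header and verifies that no Content-Encoding header comes back. The host name is a placeholder; point it at your own server.

    import http.client

    conn = http.client.HTTPConnection("www.example.com")  # your IIS server here
    conn.request("GET", "/", headers={
        "User-Agent": "Mediapartners-Google/2.1",
        "Accept-Encoding": "gzip, deflate",
    })
    resp = conn.getresponse()
    encoding = resp.getheader("Content-Encoding")
    if encoding is None:
        print("PASS: response was not compressed - exclusion works")
    else:
        print("FAIL: response was compressed:", encoding)
    conn.close()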

You can add the same exclusion for the AdSense bot on IIS 6 with ZipEnable by following the instructions above and adding the exclusion directly in ZipEnable; here is the documentation for that process (http://www.port80software.com/products/zipenable/docs#adv_set_browser). You will also want to use something like WFetch to alter your request headers, so you can spoof the user agent and confirm you get no compression when the user agent matches "mediapartners-google*" (note that in ZipEnable, slightly differently from httpZip, the search string carries the wildcard explicitly: "mediapartners-google*").

We hope this helps clear up any confusion about Google AdSense and HTTP compression. Please contact us for help here and for other tips on IIS performance boosts!

Best regards,
Port80 Software

1 Comment »

Give Images a Tighter Shave

Posted: December 8th, 2004
Filed under: Web Speed and Performance


When attempting to optimize Web site performance, the obvious place to look is dependent objects like images and other media files such as Flash, since they make up the bulk of the download (and thus the transfer delay). Today's Web developers are familiar with using Photoshop or Fireworks to optimize images, for example by reducing the colors in GIF images or lowering the quality percentage in JPEG images. But even after doing this, there is often junk lying around in the file. In JPEG images, for example, extra information like

– Digital camera technical data (EXIF)
– Comment and application blocks
– Thumbnail information

may add more data to your image file than most end users would ever care about. For example, when we stripped some of the images used on our site, we saw noticeable (though not enormous) savings: about 2.5 KB shaved off one image (17 KB down to 14.5 KB) just by running it through an image stripper like PureJPG:

http://www.yafla.com/papers/purejpeg/filter_unnecessary_jpeg_info_such_as_exif.htm
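If you want to script this kind of stripping yourself, the third-party piexif library for Python can do it losslessly. This is our own illustrative suggestion (not how PureJPG or w3compiler works), the filenames are placeholders, and note that piexif removes only the EXIF segment, embedded thumbnail included; comment blocks would need separate handling.

    import os
    import piexif  # third-party: pip install piexif

    SRC, DST = "photo.jpg", "photo_stripped.jpg"  # placeholder filenames

    piexif.remove(SRC, DST)  # copies the JPEG, minus its EXIF segment

    before, after = os.path.getsize(SRC), os.path.getsize(DST)
    print(f"{before} -> {after} bytes ({before - after} bytes of metadata shaved)")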

While this is not a tremendous savings, every little bit helps, and since it does not distort the image at all, we have already spec'd this feature for an upcoming release of w3compiler. Some folks might ask why we don't do image crunching like this in our server products: we could, but the performance impact can be significant. However, we'd be happy to hear from anybody who thinks this approach makes sense for httpZip as well…

No Comments »

Image Caching: Easy and Overlooked

Posted: December 1st, 2004
Filed under: Web Speed and Performance


The new caching features in ASP.NET are talked about a lot, and that's because .NET developers now have many features to take advantage of to improve the performance of their applications through server-side caching. Things like fragment caching and data caching, along with their wide variety of configuration options, give developers all the tools they need to minimize expensive procedures in their dynamic pages, speeding up their applications.

But with all of the smart caching decisions that good .NET developers can make in developing their applications, there are some very easy caching related performance improvements that can be made by anyone with a Web site. And they’re not nearly as complex as server-side caching.

I'm referring to making your files cacheable in the end user's browser cache. Since Web applications serve much more than .aspx files (most responses are static files like images, CSS, and large scripts), there is no reason these should be sent without the cache-related HTTP headers that get them cached in the user's browser.

Compared to the decisions that go into implementing ASP.NET server-side caching, devising a client-side page caching policy should be a cakewalk. It might make more sense for a system admin, a web site manager, or whoever analyzes the web traffic to come up with the policy, since they see which files are commonly requested and re-requested. Maybe all of these people should contribute. This was actually a big consideration when we developed the CacheRight tool, which uses a rules file to handle the cache control management of Web pages so that any of these people can 1) view the entire site-wide policy in one place and 2) manage that policy.

Even a loose and informal policy can make a huge difference. The most basic, yet beneficial, policy is simply to cache your images. The perceived benefit is always quite dramatic: many people looking to improve their Web site's performance through compression tools and expensive content distribution networks are often most impressed by simply doing this. When the images on your Web site are displayed from a user's cache, without the delay of an unnecessary trip to your server to check whether a file has changed, they pop into view. When this happens, it is noticeable, and with repeat visitors to your Web site, it will happen. eBay religiously uses cache control headers for its images. Outlook Web Access does too.

This isn't low-hanging fruit on the Low-Cost Web Site Acceleration tree; this is more like a ripe 20 lb. watermelon that you step over every time you walk through the patch. There's just no good reason not to take advantage of it. It's too easy. And this simple technique helps to solve many of the problems that send people in search of expensive Web acceleration solutions.

To illustrate, check the Last-Modified value of your site's navigation images. They probably haven't changed since your design firm created them more than a year ago. A year's worth of conditional GET requests, and of unnecessary connections opened to make those requests, is wasteful when your users already have copies in their cache. Just assign these images an expiry time of 24 hours and revisit your web site a couple of times. You'll see the difference.
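For reference, here is what a 24-hour expiry looks like at the HTTP level, sketched in Python. On IIS you would configure this per directory, or with a rules file in a tool like CacheRight, rather than in application code.

    from datetime import datetime, timedelta, timezone

    MAX_AGE = 24 * 60 * 60  # 24 hours, in seconds

    def image_cache_headers():
        expires = datetime.now(timezone.utc) + timedelta(seconds=MAX_AGE)
        return {
            # HTTP/1.1 caches honor max-age; Expires covers older HTTP/1.0 caches.
            "Cache-Control": f"public, max-age={MAX_AGE}",
            "Expires": expires.strftime("%a, %d %b %Y %H:%M:%S GMT"),
        }

    for name, value in image_cache_headers().items():
        print(f"{name}: {value}")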

No Comments »

Microsoft Flash Training Module: IIS 6 compression and ZipEnable

Posted: November 24th, 2004
Filed under: IIS & HTTP, Web Speed and Performance


Microsoft has created some handy Flash-based walk-throughs covering HTTP compression in general, IIS 6.0 native compression, and how to use ZipEnable (they cover pretty much everything in ZipEnable except browser compatibility…).

These modules are also a great way to get non-technical folks familiar with HTTP compression concepts and how to make it work on Windows Server 2003.

Access the free training modules at http://www.microsoft.com/windowsserver2003/tryiis/training/compression.mspx.

Happy Thanksgiving, folks!

No Comments »