Reducing your Page Weight
Image Credits: Rafael Peñaloza
Page weight has been an issue for web pages from the very beginning, due to slow data transfer rates. It is a bit like obesity in the human world: a man-made problem that has worsened over time. On the web, though, developers are to blame, without any excuses.
Average page weight grew by 32% in 2013, reaching the 1.7 MB mark and up to 100 individual HTTP requests per page. In many cases the numbers are even higher, they keep growing, and this has become a serious matter to look into.
The consequences of an overloaded, overweight website can be observed from the bottom line all the way to the very top. A few signs are:
- The larger the download size, the slower the user experience becomes: not everyone has a 20 MB/s Internet connection, and there are still countries running on copper cables.
- Poor mobile performance: 4G is still a dream for many mobile users. On 3G and slower networks a 1 MB web page can take anywhere between one minute and one and a half to load; however well you design the UI, users won't wait.
- Excess bandwidth use, for both the client and yourself: the extra load eats a lot of bandwidth, especially for your users. Browsing on expensive high-speed data plans, a user will definitely think twice before using your service. With assets coming from different servers the effect on your own servers may not look big, but it is there. A small optimization can save a lot of money.
Unfortunately, there is no straightforward solution for this obesity other than regular exercise and a balanced diet. A couple of techniques that can slim the fat fellow down are:
- Load what is required
- Compress data & cache it
Compression is an effective way to reduce the size, because the time spent compressing and decompressing the data normally stays below the time it would take to transfer the data uncompressed over the network.
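For instance, on a Node.js server (a stack this article doesn't specify, so treat it as a hypothetical setup), gzip compression can be switched on with Express and its compression middleware in a few lines:

```typescript
import express from 'express';
import compression from 'compression';

const app = express();

// Compress every response larger than ~1 KB before it leaves the server.
// Browsers that sent "Accept-Encoding: gzip" receive the compressed body.
app.use(compression({ threshold: 1024 }));

// Serve the site's static files.
app.use(express.static('public'));

app.listen(3000, () => console.log('listening on http://localhost:3000'));
```

Most web servers (Apache, Nginx, IIS) offer the same feature as a one-line configuration switch, so there is rarely an excuse to ship text assets uncompressed.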
Caching means storing files in the browser, if it supports it, so they need not be downloaded again. This doesn't help a first-time visitor, but it can easily cut the load time in half on the next visit. A small sketch of cache headers follows, and after that, a few more ways to achieve these reductions.
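Sticking with the same hypothetical Express setup as above, long-lived cache headers for static assets are a one-option change; the folder name and lifetime below are placeholders:

```typescript
import express from 'express';

const app = express();

// Static assets are sent with Cache-Control: max-age=2592000 (30 days),
// so returning visitors read them from the browser cache instead of the wire.
// ETags let the browser revalidate cheaply once the cache entry expires.
app.use(express.static('public', { maxAge: '30d', etag: true }));

app.listen(3000);
```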
Content Delivery Networks can be used to mirror the data for faster transfer.
Unused assets, widgets, JavaScript, CSS files or anything else the page does not require should stay on the server and never be transferred to the client. This alone can bring the size down by a fair amount.
Concatenate and minify CSS: the above does not always apply; maybe you have no unused files, or you use a package whose parts cannot be used independently of one another. In either case you can still strip the extra whitespace from the files and merge similar file types into one to reduce the number of HTTP requests, as in the sketch below.
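As a rough illustration of the idea (real build tools such as clean-css or cssnano do this far more thoroughly), a naive concatenate-and-minify pass can be a short Node script; the file names are made up:

```typescript
import { readFileSync, writeFileSync } from 'fs';

// Hypothetical stylesheet names; in practice these come from your build config.
const files = ['reset.css', 'layout.css', 'theme.css'];

const bundle = files
  .map((file) => readFileSync(file, 'utf8'))
  .join('\n')
  .replace(/\/\*[\s\S]*?\*\//g, '')   // strip /* comments */
  .replace(/\s+/g, ' ')               // collapse runs of whitespace
  .replace(/\s*([{}:;,])\s*/g, '$1')  // trim spaces around punctuation
  .trim();

// One file, one HTTP request, and noticeably fewer bytes.
writeFileSync('bundle.min.css', bundle);
```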
Load each element at the right time. Make sure CSS is loaded before page rendering begins, so the browser doesn't have to restyle the page. If your JavaScript only makes the experience nicer and the website is usable without it, you had better load it at the end.
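One way to push non-essential JavaScript to the very end, sketched here with an illustrative file name, is to inject it only after the page has fired its load event:

```typescript
// Inject a script tag only after the page has finished loading,
// so it never blocks rendering. The URL is a placeholder.
function loadDeferred(src: string): void {
  const script = document.createElement('script');
  script.src = src;
  script.async = true;
  document.body.appendChild(script);
}

window.addEventListener('load', () => {
  loadDeferred('/js/non-critical-widgets.js');
});
```

The simpler route, placing the script tag just before the closing body tag or marking it with defer, achieves much the same thing.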
Compressing images with compression tools brings their size down; there is no question that the contribution of images to page size is always high, so resizing them can have a significant effect on page weight. Lazy loading, or loading on demand, can also be used if you are not concerned about SEO.
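A bare-bones lazy-loading sketch using IntersectionObserver could look like the following; it assumes the markup carries the real URL in a data-src attribute:

```typescript
// Load the real image only when its placeholder scrolls near the viewport.
// Assumed markup: <img data-src="/images/photo.jpg" alt="...">
const lazyImages = document.querySelectorAll<HTMLImageElement>('img[data-src]');

const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src ?? '';
    obs.unobserve(img); // each image only needs to be swapped once
  }
}, { rootMargin: '200px' }); // start fetching a little before it becomes visible

lazyImages.forEach((img) => observer.observe(img));
```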
Web fonts did revolutionize the web industry by letting a font be used even when it is not installed on the machine, but because they are downloaded every time, they eat some bandwidth, so limit font usage to one or two per website.
Still more techniques can be used, like switching to font icons or removing social buttons, but with all these optimizations done there is no point unless you measure the results. A few tools that help:
- Pingdom: Pingdom is one of the best in the industry. It reveals everything you could possibly measure, from page weight, download speed, code analysis, performance grades and development suggestions to a historical timeline recording your progress from the first time you analyzed the site.
- Mozilla Web Developer Add-on: this toolbar has been around almost since the dawn of Firefox, thanks to Chris Pederick. It can show you the size of the web page, compressed and uncompressed, and much more.
- Google PageSpeed Insights
- YSlow
- GTmetrix: analyzes the same website with both YSlow and Google PageSpeed and presents the results in a single panel.
Website size mainly depends on the type of service you are providing, and the amount of optimization depends on the nature of your audience. If you run an image-serving service, you have to give the user high-resolution images; serving compressed images makes no sense. At the same time, a content-rich website needs to serve its content at a higher priority than its images.
I hope this rough checklist helps you get your fat guy's size down.