Page Speed v1.0 vs. YSlow v2.0

Last week Google announced a new open source plug-in for Firefox called Page Speed, which can be used to assess the performance of web pages and make recommendations for speeding up delivery.  To me this sounded a lot like YSlow; of course, that plug-in was developed and maintained by Yahoo!, so it makes sense that Google would want to come up with its own version.  As I hadn't yet upgraded to YSlow 2.0, I figured this was a good time to see what the new version had to offer and how Page Speed compares. All testing was conducted on the DevCentral home page with an unpopulated cache.

The first problem I encountered with both is that I am currently using Firefox 3.5 beta and Firebug 1.4, and neither add-on currently supports these beta releases.  This meant reverting back to Firefox 3.0 and Firebug 1.3.3.  The next incompatibility I found was between Page Speed and HttpWatch: the two add-ons are considered incompatible when it comes to the Page Speed Activity tab.

The warning message is rather vague, and not to be deterred, I went ahead and tested what would happen if I ran the two together.  I was able to run both the Page Speed Activity report and HttpWatch at the same time, but the resulting graphs from Page Speed were illegible.  The two add-ons, when run together, also reported different response times: Page Speed said the page took 16 seconds while HttpWatch showed only 14.4 seconds.

YSlow, on the other hand, does not present any incompatibility warning message, and the response time results are quite similar: 12.735 seconds for HttpWatch and 12.744 seconds for YSlow.

One of the nicest new features of YSlow 2.0 is the ability to customize the rules that are applied.  There are 3 default rulesets: YSlow (V2), Classic, and Small Site or Blog.  The Classic ruleset provides 13 rules, while version 2 adds 9 additional rules.  I'm a little confused by the Small Site or Blog option, as some of the rules that have been de-selected, like using an Expires header, should apply to any site.  Any of the default rulesets can be customized to add or remove a rule. For example, with DevCentral we get an F for Use a Content Delivery Network (CDN) because YSlow doesn't pick up the fact that we have built our own mini-CDN with acceleration solutions from F5 Networks.
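To make the idea of a customizable ruleset concrete, here is a purely illustrative sketch in Python. This is not YSlow's internal format; the rule names, weights, and scores are invented. It simply shows how per-rule scores can combine into an overall grade, and how dropping a rule such as the CDN check changes the result.

```python
# Purely illustrative: a toy YSlow-style ruleset where each rule has a
# weight, each page gets a 0-100 score per rule, and the weighted average
# maps to a letter grade. Names, weights, and scores are invented.

def overall_grade(scores, weights):
    """Weighted average of the rule scores present, mapped to a letter."""
    total_weight = sum(weights[rule] for rule in scores)
    avg = sum(scores[rule] * weights[rule] for rule in scores) / total_weight
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if avg >= cutoff:
            return letter, avg
    return "F", avg

weights = {"fewer_http_requests": 8, "use_cdn": 6, "far_future_expires": 10, "gzip": 8}
scores  = {"fewer_http_requests": 85, "use_cdn": 30, "far_future_expires": 75, "gzip": 95}

letter, avg = overall_grade(scores, weights)
print(f"Default ruleset: {letter} ({avg:.0f})")   # the CDN rule drags the grade down

# "Customizing" the ruleset: drop the CDN check, as you might for a site
# fronted by its own acceleration/mini-CDN tier.
custom = {rule: score for rule, score in scores.items() if rule != "use_cdn"}
letter, avg = overall_grade(custom, weights)
print(f"Custom ruleset:  {letter} ({avg:.0f})")   # same page, higher grade
```

That is essentially what removing the CDN rule does for DevCentral's score: the page hasn't changed, but the grade now reflects only the rules that actually apply to it.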

Now onto some of the recommendations that are provided. 

Browser Caching

For repeat visitors, Cache-Control or Expires headers are very useful.  Where YSlow and Page Speed differ is in how long they recommend content be cached for.  YSlow recommends 2 days while Page Speed suggests 30 days, and I have to say I agree with YSlow on this point.  If there is no way for a browser to recognize that cached content has changed, a static 30-day expiration may result in stale content being served to end users.  To use an expiration like 30 days, the development process has to change as well to include some sort of versioning to prevent the browser from serving stale content.  Changing the development process isn't always possible, which is why a shorter expiration may be needed.  One of the downsides to YSlow is that it uses a default value of 48 hours: if objects are cached for less than this time, points are deducted and a low score is given.  Even though DevCentral uses expiries between 2 days and 180 days, 27 objects are flagged by YSlow as not having a far-future Expires.  The items that are flagged are those with a 2-day expiry, whose Cache-Control headers are set to public, s-maxage=14400, max-age=172800.  I believe what is happening is that between the item being cached and YSlow running its analysis, just enough milliseconds have passed for the remaining lifetime to fall just under 2 days and trigger the warning. It would be nice if you could configure what counts as an appropriate expiry for your business and have the tools use that value instead of having one imposed on you.
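As a rough sketch of the check these tools are making, the Python snippet below fetches an object and works out how far in the future its Cache-Control/Expires headers push expiration, using YSlow's 48-hour default as the cutoff. The URL is just a placeholder, and this is not how either plug-in is actually implemented.

```python
# Sketch: report an object's freshness lifetime from its caching headers,
# in the spirit of the far-future-Expires checks YSlow and Page Speed run.
# The URL is a placeholder; point it at any page or static object.
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime
import urllib.request

URL = "https://devcentral.f5.com/"      # placeholder, not taken from the post
CUTOFF = 2 * 24 * 3600                  # YSlow's default 48-hour threshold, in seconds

with urllib.request.urlopen(URL) as resp:
    headers = resp.headers

lifetime = None
for directive in headers.get("Cache-Control", "").split(","):
    directive = directive.strip()
    if directive.startswith("max-age="):
        lifetime = int(directive.split("=", 1)[1])       # max-age takes precedence

if lifetime is None and headers.get("Expires"):
    expires = parsedate_to_datetime(headers["Expires"])  # assumes a well-formed date
    lifetime = (expires - datetime.now(timezone.utc)).total_seconds()

if lifetime is None:
    print("No caching headers: repeat visitors re-download this object")
elif lifetime < CUTOFF:
    print(f"Expires in {lifetime / 3600:.1f} hours: under the 2-day default, so flagged")
else:
    print(f"Expires in {lifetime / 86400:.1f} days: passes the far-future check")
```

Pairing a far-future expiry with versioned filenames (for example, style.v2.css instead of style.css) is the usual way to get the best of both: long caching with no risk of stale content.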

Compression

Using GZIP is a great optimisation technique and should be used whenever possible.  Of the roughly 20 text files on the DevCentral homepage, 2 are not compressed, which is picked up and flagged by both tools.  It is unfortunately not always possible to compress every single text file.  I have spent hours with customers trying to debug why an application doesn't work when compression is enabled; in the end it turns out to be a single script file that breaks the page's functionality when compressed, and disabling compression for that script resolves the problem.  If there are multiple compressed files on a page and only 1 or 2 that aren't compressed, there may be a reason for it.
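For a quick way to see whether a given text resource is being compressed, and what GZIP saves (or would save), the sketch below requests it with Accept-Encoding: gzip and compares sizes. The URL is a placeholder, and this isn't how either tool works internally.

```python
# Sketch: check whether a text resource is served gzip-compressed and how
# much compression saves. The URL is a placeholder for any script/CSS file.
import gzip
import urllib.request

URL = "https://devcentral.f5.com/"   # placeholder, not taken from the post

req = urllib.request.Request(URL, headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req) as resp:
    body = resp.read()                                    # raw bytes as sent on the wire
    encoding = resp.headers.get("Content-Encoding", "identity")

if encoding == "gzip":
    original = gzip.decompress(body)
    saved = 100 * (1 - len(body) / len(original))
    print(f"gzip on: {len(original)} -> {len(body)} bytes ({saved:.0f}% smaller)")
else:
    # Not compressed; show roughly what gzip would have saved if enabled.
    compressed = gzip.compress(body)
    saved = 100 * (1 - len(compressed) / len(body))
    print(f"gzip off: enabling it would cut {len(body)} bytes by about {saved:.0f}%")
```

Running a check like this file by file is also a handy way to isolate the one script that misbehaves when compression is turned on.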

Reduce DNS Lookups

Page Speed and YSlow both recommend reducing the number of DNS lookups and identify the domains the content is coming from.  YSlow gives a score of A even though there are 5 domains.  Page Speed suggests eliminating some of these items; unfortunately that can't be done, as they are third-party resources used for various analytics and tracking, one of which happens to be from Google.  It seems YSlow is a little smarter in identifying what these resources are for and recognizing that they can't be reduced.  This recommendation is actually a little puzzling to me, as one way to speed up page download is to spread requests across multiple domains so that more items download in parallel.  Page Speed offers this as a recommendation as well, but doesn't that contradict the recommendation to reduce DNS lookups?
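To put a number on what each extra domain costs, the sketch below times a DNS lookup for each unique hostname a page references. The hostnames are placeholders, and a local resolver or OS cache can hide the true first-visit cost.

```python
# Sketch: time a DNS lookup for each unique hostname a page pulls content
# from. The hostnames are placeholders for the domains a tool like YSlow
# or Page Speed would list; a local resolver cache can mask the real cost.
import socket
import time

hostnames = [
    "devcentral.f5.com",             # placeholder list, not from the original post
    "www.google-analytics.com",
    "ajax.googleapis.com",
]

for host in hostnames:
    start = time.perf_counter()
    try:
        socket.gethostbyname(host)                       # blocking A-record lookup
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{host}: {elapsed_ms:.1f} ms")
    except socket.gaierror:
        print(f"{host}: lookup failed")
```

Each additional hostname adds one of these lookups on a cold visit, but it also gives the browser another pool of parallel connections, which is exactly the tension between the two recommendations.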

Generally speaking, both tools are valuable for understanding where a site may be slow and where improvements can be made, but make sure to investigate whether or not the recommendations work for your application.  There may be reasons why things are the way they are.  It will be interesting to see how these tools continue to evolve.

More Stories By Dawn Parzych

Dawn Parzych is a Technical Product Marketing Manager at Instart Logic. Dawn has had a passion for web performance for over 15 years with a focus on how to make the web faster. As a technical product marketing manager at Instart Logic, she researches and writes about trends in the web performance space and how they impact the user experience. Prior to joining Instart Logic, Dawn worked at F5 Networks, Gomez & Empirix.
