By Dawn Parzych
June 8, 2009 11:22 AM EDT
Last week Google announced a new open source plug-in for Firefox called Page Speed, which can be used to assess the performance of web pages and make recommendations for speeding up their delivery. To me this sounded a lot like YSlow; of course, that plug-in was developed and maintained by Yahoo!, so it makes sense that Google would want to come up with its own version. As I hadn't yet upgraded to YSlow 2.0, I figured this was a good time to see what the new version had to offer and how Page Speed compares. All testing was conducted on the DevCentral home page with an unpopulated cache.
The first problem I encountered with both is that I am currently using Firefox 3.5 beta and Firebug 1.4, and neither add-on currently supports these beta releases. This meant reverting to Firefox 3.0 and Firebug 1.3.3. The next incompatibility I found was between Page Speed and HttpWatch: these two add-ons are considered incompatible when it comes to the Page Speed Activity tab.
The warning message is rather vague, so, not to be deterred, I went ahead and tested what would happen if I ran the two together. I was able to run both the Page Speed Activity report and HttpWatch at the same time, but the resulting graphs from Page Speed were illegible. When run together, the two add-ons also reported different response times: Page Speed said the page took 16 seconds, while HttpWatch showed only 14.4 seconds.
YSlow, on the other hand, does not present any incompatibility warning, and the response time results are quite similar: 12.735 seconds for HttpWatch and 12.744 seconds for YSlow.
One of the nicest new features of YSlow 2.0 is the ability to customize the rules that are applied. There are three default rulesets: YSlow (v2), Classic, and Small Site or Blog. The Classic ruleset provides 13 rules, while version 2 adds 9 more. I'm a little confused by the Small Site or Blog option, as some of the rules it de-selects, like using an Expires header, should apply to any site. Any of the default rulesets can be customized to add or remove a rule. For example, DevCentral gets an F for Use a Content Delivery Network (CDN) because YSlow doesn't pick up the fact that we have built our own mini-CDN with acceleration solutions from F5 Networks.
Now onto some of the recommendations that are provided.
For repeat visitors, Cache-Control or Expires headers are very useful. Where YSlow and Page Speed differ is in how long they recommend content be cached: YSlow recommends 2 days, while Page Speed suggests 30. I have to say I agree with YSlow on this point. If there is no way for a browser to recognize that cached content has changed, then a static 30-day expiration may result in stale content being served to end users. To use an expiration as long as 30 days, the development process also has to change to include some sort of versioning to prevent the browser from serving stale content. Changing the development process isn't always possible, which is why a shorter expiration may be needed. One of the downsides to YSlow is that it uses a default threshold of 48 hours: if objects are cached for less than this, points are deducted and a low score is given. Even though DevCentral uses expiry times between 2 days and 180 days, 27 objects are flagged by YSlow as not having a far-future Expires. The flagged items are those with a 2-day expiry; their Cache-Control headers are set to public, s-maxage=14400, max-age=172800. I believe what is happening here is that, between the item being cached and YSlow running the analysis, just enough milliseconds have passed for the content to fall just under 2 days and trigger the warning. It would be nice if you could configure what counts as an appropriate expiry for your business and have the tools use that value instead of having one imposed on you.
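As a rough illustration of the threshold issue, here is a minimal Python sketch, not part of either tool, that pulls max-age out of a Cache-Control header and compares it against YSlow's default 48-hour cutoff. The header value is the one quoted above; the parsing is deliberately simplistic.

```python
# Sketch: extract max-age from a Cache-Control header and compare it
# to YSlow's default 48-hour "far-future" cutoff.

def max_age_seconds(cache_control):
    """Return the max-age directive in seconds, or None if absent."""
    for directive in cache_control.split(","):
        name, _, value = directive.strip().partition("=")
        if name == "max-age" and value.isdigit():
            return int(value)
    return None

YSLOW_CUTOFF = 48 * 3600  # 172800 seconds

header = "public, s-maxage=14400, max-age=172800"
age = max_age_seconds(header)
print(age)                  # 172800 -- exactly at the cutoff
print(age >= YSLOW_CUTOFF)  # True on paper, but any time elapsed since
                            # caching leaves the remaining freshness
                            # just under 48 hours
```

A max-age sitting exactly on the boundary is why these objects get flagged: the moment any time passes, the remaining lifetime dips below the cutoff.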
Using GZIP is a great optimization technique and should be used whenever possible. Of the roughly 20 text files on the DevCentral home page, 2 are not compressed, which both tools pick up and flag. Unfortunately, it is not always possible to compress every single text file. I have spent hours with customers trying to debug why an application doesn't work when compression is enabled; in the end it turns out to be a single script file that, when compressed, breaks the page's functionality, and disabling compression for that script resolves the problem. If there are many compressed files on a page and only 1 or 2 that aren't, there may be a reason for it.
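To get a feel for what compression buys on text content, here is a quick sketch using Python's standard gzip module. The markup is made up for the example, but real HTML, with its repeated tags, compresses similarly well.

```python
import gzip

# Hypothetical repetitive HTML, standing in for a typical text resource.
html = b"<html><body>" + b"<p>sample paragraph text</p>" * 200 + b"</body></html>"

compressed = gzip.compress(html)
print(len(html), len(compressed))
# Repetitive text like this typically shrinks to a small fraction
# of its original size, which is why the tools flag uncompressed
# text files so aggressively.
```

The savings are largest on exactly the kind of files, HTML, CSS, and scripts, that the tools check.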
Reduce DNS Lookups
Page Speed and YSlow both recommend reducing the number of DNS lookups and identify the domains the content is coming from. YSlow gives a score of A even though there are 5 domains. Page Speed suggests eliminating some of these items; unfortunately that can't be done, as they are third-party resources used for various analysis and tracking, one of which just happens to be from Google. It seems YSlow is a little smarter at identifying what these resources are for and recognizing that they can't be reduced. This recommendation is actually a little puzzling to me, as one way to speed up page download is to spread requests across multiple domains so more items can download in parallel. Page Speed offers this as a recommendation as well, but doesn't it contradict the recommendation to reduce DNS lookups?
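Counting the lookups a page forces is just a matter of counting the unique hostnames across its resource URLs; a minimal sketch follows, where the URLs are hypothetical stand-ins rather than DevCentral's actual resources.

```python
from urllib.parse import urlparse

# Hypothetical resource URLs for a page; each distinct hostname costs
# the browser one DNS lookup on a cold cache.
resources = [
    "http://example.com/index.html",
    "http://static.example.com/styles.css",
    "http://static.example.com/app.js",
    "http://analytics.example.net/track.js",
]

hostnames = {urlparse(url).hostname for url in resources}
print(len(hostnames))  # 3 distinct hostnames -> 3 DNS lookups
```

This also makes the tension concrete: sharding content across more hostnames increases parallel downloads but raises this count, which is exactly the trade-off the two recommendations pull against.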
Generally speaking, both tools are valuable for understanding where a site may be slow and where improvements can be made, but make sure to investigate whether the recommendations work for your application. There may be reasons why things are the way they are. It will be interesting to see how these tools continue to evolve.