

Windows 8 Notifications: Leveraging Azure Storage

When incorporating images into your toast and tile notifications, there are three options for hosting those images:

  1. within the application package itself, using an ms-appx:/// URI,
  2. within local application storage, using an ms-appdata:///local URI, or
  3. on the web using an HTTP or HTTPS URI.
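To make the three options concrete, here is a small sketch; the file names and storage account shown are hypothetical examples, not taken from the sample app:

```javascript
// Hypothetical image references illustrating the three hosting options
var packagedImage = "ms-appx:///images/tile.png";          // in the app package
var localImage = "ms-appdata:///local/images/tile.png";    // in local app storage
var webImage = "http://win8apps.blob.core.windows.net/images/tile.png"; // on the web

// Any of the three forms can appear as the src of a tile template's image element
function tileImageXml(src) {
  return '<image id="1" src="' + src + '"/>';
}
```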

The primary advantage of hosting images on the web is that it insulates the application from changes to those images.  The images hosted on the web can be modified or refined without requiring an update to the application, which both necessitates a resubmission to the Windows Store and relies on users to update the application.

Windows Azure, Microsoft’s public cloud offering, can be an incredibly convenient and cost-effective way to manage the images used in notifications. The easiest way to serve image content from Windows Azure is via blob storage, which provides highly scalable and highly available storage of unstructured data at one of eight Windows Azure data centers worldwide. A Content Delivery Network (currently comprising 24 nodes) is also available to improve performance and user experience for end users who aren’t located near a data center.

You might be surprised how simple it is to set this up… and did I mention it’s free to try?! This post will take you through all the steps, including:

Setting up your Windows Azure Storage Account

  1. Apply for a 90-day free Windows Azure subscription, leverage your MSDN benefits, or select one of the other options for Windows Azure access. You will be prompted for a credit card to verify you’re a carbon-based life form, but the free account options come with no risk, and you will not be charged for any usage (unless you specifically opt in to that).
  2. Create a storage account, which is accessible via a unique URL endpoint such as http://win8apps.blob.core.windows.net and comes with an access key (well, actually two!), each of which grants administrator-level access to that account.
    Windows Azure storage account in portal

    Windows Azure storage keys

    There are two access keys to enable updating keys without incurring downtime for applications that might be referencing Windows Azure storage. If you set your application (typically it’s a cloud application) to read the keys from the configuration file, you can update that file in place (without bringing down the app) to use the secondary access key, then you can regenerate the primary access key. This provides the capability to have ‘rolling key updates,’ which is a good policy in general to mitigate the impact of compromised storage keys.
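The rolling-key pattern can be sketched as follows; the config shape here is hypothetical, standing in for whatever configuration file a real cloud application would read:

```javascript
// Sketch of the rolling-key update pattern. The config object is a
// hypothetical stand-in for the application's configuration file.
var config = {
  activeKey: "primary",
  keys: { primary: "PRIMARY-KEY", secondary: "SECONDARY-KEY" }
};

function currentStorageKey(cfg) {
  return cfg.keys[cfg.activeKey];
}

// Step 1: switch the app to the secondary key (an in-place config update)
config.activeKey = "secondary";
// Step 2: it is now safe to regenerate the primary key in the portal
config.keys.primary = "REGENERATED-PRIMARY-KEY";
```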

Storing Images in the Cloud

Now that your account is set up, it’s time to move some images into it. You can’t do that via the Windows Azure portal, but there are many Windows Azure storage utilities available to manage your blob storage assets, including Cloud Storage Studio (free trial), CloudBerry Explorer (freeware), CloudXplorer (free download), or Azure Storage Explorer (open source).

I’ll use CloudBerry Explorer in this post, but you shouldn’t have any trouble figuring out the analogous steps in your Azure storage manager of choice.

  1. Configure the utility with your storage account and either the primary or secondary access key: Selecting Windows Azure Account within CloudBerry Explorer
  2. Create a new container (call it whatever you like; images seems reasonable), and set the access container policy to allow public access to blobs but not list the container contents. Creating a  new container

    Windows Azure blob storage is organized into a two-level hierarchy:

    • containers, which govern the access policy for all of the content within them, and
    • blobs, which are the individual files or other assets being stored.

    You can think of a container as a file folder, though in blob storage containers cannot be nested. You can, however, create blob assets with a naming convention that mimics a file path, for instance:

    Blob storage reference components

    Each blob in storage also has an associated content-type that is set when the blob is uploaded (the default is application/octet-stream), so filename extensions that are part of the blob name itself aren’t really relevant. Most of the storage utilities manage the content type for you transparently.
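    The content-type handling mentioned above amounts to a lookup from file extension to MIME type; a minimal sketch (the mapping shown is illustrative, not the utilities' actual table):

```javascript
// Infer a blob's content-type from its file extension, the way a storage
// utility typically does at upload time (illustrative mapping)
var CONTENT_TYPES = {
  ".png": "image/png",
  ".jpg": "image/jpeg",
  ".gif": "image/gif"
};

function contentTypeFor(blobName) {
  var dot = blobName.lastIndexOf(".");
  var ext = dot >= 0 ? blobName.substring(dot).toLowerCase() : "";
  return CONTENT_TYPES[ext] || "application/octet-stream"; // the storage default
}
```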

  3. Copy a tile image being used locally over to the container. With CloudBerry this is a simple drag-and-drop in an Explorer-like interface. Copying image from local storage to blob storage in CloudBerry Explorer
  4. With the image file now in the cloud, the code referencing that image would appear similar to the following, which is a modification of sendTileLocalImageNotificationWithXml in the sendLocalImageTile.js file within the App tiles and badges sample.
    // get an XML DOM version of a specific template by using getTemplateContent
    var tileXml = Windows.UI.Notifications.TileUpdateManager.getTemplateContent(
        Windows.UI.Notifications.TileTemplateType.tileWideImageAndText01);
    // get the text attributes for this template and fill them in
    var tileTextAttributes = tileXml.getElementsByTagName("text");
    tileTextAttributes[0].appendChild(tileXml.createTextNode(
        "This tile notification uses Windows Azure"));
    // get the image attributes for this template and fill them in
    var tileImageAttributes = tileXml.getElementsByTagName("image");
    tileImageAttributes[0].setAttribute("src",
        "http://win8apps.blob.core.windows.net/images/green.png");
    // create the notification from the XML
    var tileNotification = new Windows.UI.Notifications.TileNotification(tileXml);
    // send the notification to the app's application tile
    Windows.UI.Notifications.TileUpdateManager.createTileUpdaterForApplication()
        .update(tileNotification);

While this works fine, it requires that the developer accommodate scaling and contrast themes by explicitly requesting the exact image needed, something that is automatically handled by the resource manager when using images stored within the application package.

Windows Azure can manage this for you too though with just a little bit of code, as you’ll see next!

Implementing a Notification Image Server with Windows Azure Web Sites

When using a web URI as the source for a notification image, you have the option to pass additional information via the HTTP GET request in the form of query parameters:

• ms-scale (values: 80, 100, 140, 180), indicating the size of the image needed,

• ms-lang (values complying with BCP-47), indicating the culture for the request, and

• ms-contrast (values: standard, black, or white), indicating which of the available high-contrast modes is in effect.

These query parameters are sent along with the request whenever the addImageQuery attribute is set within the tile or toast template. That attribute can be set on the visual, binding, or image element of the tile schema or toast schema:

// get the image attributes for this template and fill them in
var tileImageAttributes = tileXml.getElementsByTagName("image");
tileImageAttributes[0].setAttribute("addImageQuery", true);

And that would result in a request like the following (the parameter values shown are representative):

http://win8apps.blob.core.windows.net/images/green.png?ms-scale=100&ms-contrast=standard&ms-lang=en-US
Of course, Windows Azure storage has no mechanism to interpret the query string, so you’d need to build a service that does so, and then modify the notification template to use the name of the server rather than Windows Azure storage directly. Essentially the service should proxy each of the requests, strip out the query parameters, and make a new server-initiated request for the image best matching the desired scale, contrast mode, and language.

One could build a service like that in a host of programming languages - ASP.NET, PHP, node.js, or anything that can run on Windows Azure (which is anything you can run on Windows!). In this case, I opted for node.js given its lightweight nature and support via WebMatrix.

Building the Node.js Project in WebMatrix

    The first step is to create a new site, which you can do by selecting the Empty Site template for node.js:

      Creating empty node.js site in WebMatrix

    Next, replace the entire contents of server.js with the gist I created (do keep in mind this is not production quality code!).

      Replacing server.js contents with gist code

    In that code, replace STORAGE_ACCOUNT with the name of your Windows Azure storage account (e.g., win8apps is the name of the account I used in the previous example). Optionally, add a list of BCP-47 language codes for which you are providing unique images. To reduce the number of image searches (and resulting storage transactions), I set the algorithm up to consider the ms-lang value only if it's a language explicitly included in the CUSTOM_LANGUAGES array.

    // Azure Storage Account host
    var AZURE_URI = "STORAGE-ACCOUNT.blob.core.windows.net";
    var AZURE_PROTOCOL = "http";
    var AZURE_PORT = 80;
    // only build specific URLs for the following
    var CUSTOM_LANGUAGES = []; // e.g.: ["en-US", "fr-FR"];

    The implementation assumes that you’ve mimicked a directory structure in Windows Azure blob storage such as the following, where images is the container name (though it can be named anything you like). The links lead directly to my storage account (which will hopefully be active when you read this!). Recall that black, en-US, and fr-FR are virtual directories and don’t actually exist; for instance, the name of the blob corresponding to the last item in the list is really fr-FR/green.scale-180.png.
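Since those virtual directories are just part of the blob name, decomposing a name like fr-FR/green.scale-180.png is simple string work; a small illustrative helper (not part of the storage API):

```javascript
// Split a blob name into its virtual directory and file name; containers
// cannot nest, so the "directories" exist only as part of the blob name
function splitBlobName(blobName) {
  var slash = blobName.lastIndexOf("/");
  return {
    virtualDir: slash >= 0 ? blobName.substring(0, slash) : "",
    fileName: blobName.substring(slash + 1)
  };
}
```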

    The web service works by inspecting the query string passed in with the image request and transforming the URL and that query string into a direct reference to Windows Azure blob storage as follows (the line numbers below refer to the code gist):

    1. The Azure Web Site host name is replaced with the AZURE_URI variable defined at Line 20 (use win8apps for STORAGE_ACCOUNT if you want to use the square tile files I set up).
    2. The incoming URL and query string, if there is one, are parsed in Line 103, and an array of candidate image URLs is initialized (Line 107).
    3. If there are no query parameters (i.e., addImageQuery was not set in the template), the only candidate URL is the original path with the host name replaced by the Azure storage account host (Line 127).
    4. If the ms-lang parameter specifies a value of interest (i.e., the value is included in the CUSTOM_LANGUAGES array initialized in Line 25), then four URLs are built (Lines 116 - 119) in the following patterns, where the ms-* segments are replaced with the value of the corresponding query parameter and imagePath, imageName, and imageExt are parsed from the incoming URL in Lines 33ff:
      1. imagePath/ms-lang/ms-contrast/imageName.scale-ms-scale.imageExt
      2. imagePath/ms-lang/ms-contrast/imageName.imageExt
      3. imagePath/ms-lang/imageName.scale-ms-scale.imageExt
      4. imagePath/ms-lang/imageName.imageExt
    5. Four additional URLs are also built without the reference to the language parameter (Lines 121 - 127):
      1. imagePath/ms-contrast/imageName.scale-ms-scale.imageExt
      2. imagePath/ms-contrast/imageName.imageExt
      3. imagePath/imageName.scale-ms-scale.imageExt
      4. imagePath/imageName.imageExt (this is the original URL path)
    6. The ordered list of up to eight candidate URLs is passed into a recursive function: retrieveImage defined in Line 57. This method issues an HTTP or HTTPS GET request to the candidate URLs, replacing the Azure Web Site host name with the Azure blob storage root URL. If the first candidate image isn't found (i.e., the status code is 404), the second is attempted and so on until there's a success code or none of the URLs is found to exist.

      If an image is found (Line 78), a 307 HTTP redirection response is returned, and the Windows 8 app will use the URL supplied in the Location header to access blob storage directly. If the list of candidate URLs is exhausted without finding an image, a 404 response (Line 61) is returned. Any other response code causes the search to terminate, with the response payload returned to the Windows 8 application, which will likely balk at the response and not show the notification.

    In the worst-case success scenario (addImageQuery was specified, but the match was a generic image with no qualifiers), there are eight requests to Windows Azure storage. This comes with both a performance hit and cost implications, since each request to storage is a transaction (billed at $0.01 per 100,000 as of this writing).

    Also note that this code does not result in exactly the same resource lookup behavior you get automatically for images stored in the application package itself. That algorithm has additional nuances and flexibility that would be too expensive (time and transaction-wise) to fully implement as a web service; in fact, even the sample code provided might be more general than your needs, but with the source code in hand you should be able to tailor it fairly easily.

    Creating an Azure Web Site

    Windows Azure Web Sites provides an extremely attractive option (read: free for one year) for hosting web content, and the integration with WebMatrix (as well as TFS and Git) makes it extremely simple to use and manage. Because Windows Azure Web Sites is in preview as of this writing, you’ll need to make a separate request via the Windows Azure portal to enable that feature before you’ll be able to carry out the following steps.

    Within the Windows Azure portal, simply create a new Azure Web Site, by supplying a unique server name – for this blog post I used win8imageserver – and specifying the Azure data center location at which you want to host the site (I selected East US).

    Creating a new Azure Web Site

    It should take less than a minute to have a new server up and running, at which point you can access it by selecting the new site from the list of provisioned sites:

    Windows Azure Web Sites listed in portal

    There won’t be much going on there yet, which makes sense because you haven’t deployed anything. To do so, you’ll need to download your Web Deploy publish settings and import them into WebMatrix. The publish profile is an XML document that contains specifics about your Azure Web Site, including a hashed password string granting the ability to deploy code to the server. Save those settings to a local file directly from the Azure Web Sites dashboard; note, though, that the file contains sensitive material, so it should be protected or even deleted once the settings have been imported, which is the next step.

    Downloading publish profile

    To associate the publish settings with WebMatrix, select the Publish option from the ribbon and import the settings from the file just downloaded from the portal.

    Importing Publish Settings into Web Matrix

    You may be prompted to test the server compatibility; you can just click through the confirmation screens. When the test has completed, you’ll see a list of the files to be deployed. Just hit the Continue button to deploy them all, although in this case the only one really required is server.js.

    List of files to be deployed to Windows Azure

    At this point, you should have a fully functional service on Windows Azure ready to respond to image requests. Although it’s the Windows 8 notification infrastructure that will make the request, you can simulate one by issuing the request (along with the query parameters) from a browser, and you should see the redirection to the specific Windows Azure blob storage URL right in the browser.

    Referencing the Server in Windows 8 Notifications

    Now comes the easy part – it’s just a different URL that needs to be provided as the notification image source! The format of the request is simply the hostname of the web service followed by the default image path, unadorned by scale, contrast, or language elements - the same semantics used to reference images in the application package.

    Here’s some representative code updating the sendLocalImageTile.js file of the App tiles and badges sample:

    // fill in a version of the square template returned by GetTemplateContent
    // (tileXml holds the wide tile template created earlier in the sample)
    var squareTileXml = Windows.UI.Notifications.TileUpdateManager.getTemplateContent(
        Windows.UI.Notifications.TileTemplateType.tileSquareImage);
    var squareTileImageAttributes = squareTileXml.getElementsByTagName("image");
    squareTileImageAttributes[0].setAttribute("src", "/images/green.png");
    // include the square template into the notification
    var node = tileXml.importNode(
        squareTileXml.getElementsByTagName("binding").item(0), true);
    var visual = tileXml.getElementsByTagName("visual").item(0);
    visual.appendChild(node);
    visual.setAttribute("addImageQuery", true);
    visual.setAttribute("baseUri", "http://win8imageserver.azurewebsites.net");
    // create the notification from the XML
    var tileNotification = new Windows.UI.Notifications.TileNotification(tileXml);
    // send the notification to the app's application tile
    Windows.UI.Notifications.TileUpdateManager.createTileUpdaterForApplication()
        .update(tileNotification);

    The most significant changes are the attribute settings on the visual element:

    • The image location in the src attribute is now a relative reference, with the addition of the baseUri attribute to the visual element, which points to the new Azure Web Site.
    • Don’t forget to set addImageQuery as well, or the scale, contrast, and language query string parameters will not be sent, and you’ll always get the default image!

    By the way, both baseUri and addImageQuery are available at multiple levels of the template hierarchy, so you can easily mix the sources of images used in the same template. You can also use baseUri with ms-appx:/// and ms-appdata:///local/ to save some typing and to make it more convenient to modify the source of your image files.

    Enhancing the User Experience with the Content Delivery Network

    In both of the scenarios I’ve outlined, the Windows 8 application makes a request to Windows Azure blob storage for a given tile (either explicitly or through a redirect from a custom image server). Those images are all located in the single data center at which you’ve set up your Windows Azure account; in my case, it’s East US. That means a user running my Windows 8 program in Washington D.C. will hit that relatively local Azure data center, but so will a user located in Sydney, Australia, or Johannesburg, South Africa. Latency will of course be much greater for those distant users, and that’s what a content delivery network is designed to mitigate.

    The Windows Azure Content Delivery Network (CDN) is a collection of edge nodes that cache content at various points around the world (24 as of this writing), each serving users in geographical proximity to that node. The first request for an image must be served directly from the data center, but the image can then be cached at the edge node for subsequent visitors to retrieve without incurring the latency of going all the way back to the data center.

    The mechanism works by leveraging a special URL host name (in lieu of *.blob.core.windows.net) which is able to ascertain the closest CDN edge node and make the appropriate request to either that node or directly back to the main data center depending on the state of the cache.

    To obtain that special URL, you’ll need to visit the previous (Silverlight) Windows Azure portal, since at this time the CDN provisioning functionality has not yet been incorporated into the HTML5 version. You can get to the Silverlight portal by clicking the Preview button at the top and selecting Take me to the previous portal.

    In the portal, (1) the CDN functionality is accessed via the Hosted Services, Storage Accounts, & CDN option on the left sidebar. CDN (2) is the last of the subcategories that then appears directly above, allowing access to the creation (3) and management of CDN endpoints.

    Steps to create a new CDN endpoint

    When creating a new endpoint, you’re prompted for the Azure storage account to which the CDN applies as well as whether you want to enable HTTPS and query string awareness. I’ve opted into HTTPS capability, which means requests from the client to the CDN endpoint will be secured; however, communication between the CDN edge node and the Windows Azure storage account still occurs via HTTP. The Query String option isn’t relevant here; it’s used to differentiate cached dynamic content served by Windows Azure Cloud (Hosted) Services if you opt to cache such content.

    CDN options

    It can take up to an hour for a new CDN configuration to propagate to all of the nodes, but once done, the CDN endpoint address will display in the portal:

    Properties of a provisioned CDN

    The endpoint assigned in my case is http://az307128.vo.msecnd.net, so I can use that anywhere I would previously have used http://win8apps.blob.core.windows.net, and since I enabled HTTPS access, that scheme will work too.  That means I can use

    tileImageAttributes[0].setAttribute("src", "http://az307128.vo.msecnd.net/images/green.png");

    for a direct reference to Windows Azure storage (keeping in mind it foregoes special handling of the scale, contrast, and language), or I can modify the node.js script (Line 20 in the gist) to

    var AZURE_URI = "az307128.vo.msecnd.net";

    and tap into the CDN with no further changes to the code!

    Weighing the Options

    In this post I covered several options for hosting notification images in the cloud, so which one is right for you? Well, the answer for questions like this is the overused "it depends," but there are some key considerations that may steer you one way or another.

    Is the Cloud for You?

    Hosting the images in the cloud decouples them from your application development and the Windows Store marketplace submission and therefore provides some agility in the development lifecycle and application customization. That comes at a price though - not just the monetary consideration for storage and cloud services, but also in the fact that if the application is not connected when a notification image is requested, the notification will not be served at all. For the best experience, an application would need to include explicit fallback functionality, like delivering a text-only notification in cases where there isn't connectivity. That's more code to write and test.

      The Cloud isn’t Free.

      Cloud services cost money (although you may be able to do quite a bit with the various free allotments available with offers like MSDN and BizSpark).

      • For storing and accessing images from Windows Azure Storage, you’ll accrue a cost for the actual storage (as of this writing, at most $0.125 per GB per month) and a transaction cost for each request to storage (at the rate of $0.01 per 100,000 requests). Chances are that’s much cheaper than the costs you’d accrue replicating the functionality on-premises.
      • If you additionally leverage the CDN, there is a separate schedule of charges, but it adds only a small overhead to the storage charges, with that overhead inversely proportional to the length of time an image can be cached. In other words, the CDN becomes more economical the less the data changes and the more it is requested.
      • If you leverage Azure Web Sites you can do so free for a year at this point; subsequent pricing has not yet been announced.
      • Lastly, you can also leverage Windows Azure Cloud Services, namely a Web Role, to provide similar functionality as an Azure Web Site but with additional support for SSL, a Service Level Agreement (SLA), and more enterprise level features and integration points. The charge for Cloud Services varies with the size of the underlying virtual machine (VM) as well as the number of instances that are running. That said, the minimum configuration meeting the SLA requirements would currently cost 4 cents per hour.
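A quick back-of-the-envelope calculation using the storage rates quoted above (figures as of this writing, and subject to change):

```javascript
// Rough monthly storage cost estimate from the rates quoted in the post
var STORAGE_PER_GB_MONTH = 0.125;       // USD, upper tier
var COST_PER_100K_TRANSACTIONS = 0.01;  // USD

function monthlyStorageCost(gigabytes, transactions) {
  return gigabytes * STORAGE_PER_GB_MONTH +
         (transactions / 100000) * COST_PER_100K_TRANSACTIONS;
}

// e.g. 1 GB of images and a million notification image requests
var estimate = monthlyStorageCost(1, 1000000);
```

For 1 GB and a million requests, that works out to roughly $0.23 a month.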

      Be Aware of Caching Behavior

      Images for notifications are cached on the Windows 8 client automatically, so changes to the image may not be immediately reflected. The management of the local cache is also somewhat opaque to the developer, with images removed from the local cache when

      • the cache is full,
      • the application is uninstalled, or
      • the user clears personal information from all her application tiles (via the Settings on the Start screen).

      If the CDN is being used, images may also be cached at edge nodes for either an explicitly defined or heuristically determined period. If an image is accessed only once in that expiry period, then the benefits of the CDN are nullified, with the added impact of roughly double the storage transaction and bandwidth charges.

      You can, however, exercise some control over the caching behavior: via the BlobProperties.CacheControl property on the blob itself, or, if a web service is serving the image content, by setting HTTP response headers to enforce a specific caching policy.
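If the node.js service serves (or redirects to) images itself, the header-based approach amounts to something like the following sketch; the helper name and the one-day lifetime are illustrative choices:

```javascript
// Build response headers that assert an explicit caching policy
function cacheHeaders(maxAgeSeconds) {
  return {
    "Cache-Control": "public, max-age=" + maxAgeSeconds,
    "Expires": new Date(Date.now() + maxAgeSeconds * 1000).toUTCString()
  };
}

var headers = cacheHeaders(86400); // cache for one day
```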


      More Stories By Jim O'Neil

      Jim is a Technology Evangelist for Microsoft who covers the Northeast District, namely New England and upstate New York. He is focused on engaging with the development community in the area through user groups, code camps, BarCamps, Microsoft-sponsored events, etc., and in general serving as an ambassador for Microsoft. Since 2009, Jim has been focusing on software development scenarios using cloud computing and Windows Azure. You can follow Jim on Twitter at @jimoneil.
