Adoption of R by large Enterprise Software Vendors

by Uday Tennety, Director, Advanced Analytics Services at Revolution Analytics

The R ecosystem has become widely popular lately, with large players such as Pivotal, Tibco, Oracle, IBM, Teradata and SAP integrating R into their product suites. All of these big players are using value-chain integration and platform-envelopment strategies to build a network effect, in order to gain maximum leverage against their competitors in the Big Data and Analytics space. The Big Data movement has gained a lot of traction in the enterprise, and the ecosystem is rapidly evolving. The end goal for most enterprises is not to collect, store and manage data, but to obtain new business insights through predictive modeling and analytics. With this objective in mind, many enterprise software vendors have embraced R for their analytics story. Below is my analysis of the various distributions of R provided by large enterprise software vendors, along with their integration strategies.

Oracle R Enterprise

Oracle R Distribution is Oracle's free distribution of open source R. Oracle R Enterprise integrates Oracle R Distribution (or open source R), the open source scripting language and environment, with Oracle Database. Oracle R Enterprise introduces variants of many R data types by overloading them in order to integrate Oracle Database with R; the names of the Oracle R Enterprise data types are the names of the corresponding R data types prefixed with "ore". Oracle's strategy with Oracle R Enterprise is to provide in-database analytics capabilities for its widely adopted enterprise RDBMS, and for its Exadata appliance.

Tibco's TERR

With its acquisition of the S+ technology from Insightful in 2008, Tibco built its own distribution of R, called TERR. TERR has been built from the ground up in C++, and the team has redesigned the data object representation by implementing those objects as abstract C++ classes.
TERR also claims to provide better performance and memory management than open source R. According to company sources, TERR is compatible with open source R and runs analytics by loading data in memory. However, TERR does not yet support data from disk, streaming or database sources; the company plans to support them in the future. Tibco has recently integrated TERR with its data visualization tool, Spotfire, to make it easy for enterprises that choose Spotfire to run an R-based analytics tool.

PivotalR

Pivotal was officially launched on April 1, 2013, when EMC decided to group together a set of EMC, VMware and Pivotal Labs products to offer a differentiated enterprise-grade Big Data platform. Pivotal's strategy with R is very similar to Oracle's strategy with Oracle R Enterprise. PivotalR is a package that enables R users to interact with the Pivotal (Greenplum) Database, as well as Pivotal HD and HAWQ, for Big Data analytics. It does so by providing an interface to the operations on tables and views in the database. PivotalR also claims to provide parallel and distributed computation on Pivotal platforms for big data analytics, and it provides a wrapper for MADlib, an open-source library for parallel and scalable in-database analytics.

SAP

SAP has integrated R with its in-memory database, HANA, to allow the use of R for specific statistical functions. However, SAP does not ship the R environment with the SAP HANA database, nor does it provide support for R. To use the SAP HANA integration with R, one needs to download R from CRAN and configure it; an Rserve configuration is also needed for the integration to work. SAP's strategy in integrating HANA with R is to provide a well-known and robust environment for advanced data analysis, while providing a support mechanism in HANA for specific statistical functions.
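As the SAP section notes, the HANA-side integration depends on a separately configured Rserve daemon. Purely as an illustrative sketch (assuming the Rserve package has been installed from CRAN; the startup flags shown are generic Rserve options, not SAP-specific values), the server-side piece looks roughly like this:

```r
# Start an Rserve daemon that an external client such as SAP HANA can call into.
# Assumes install.packages("Rserve") has already been run on this machine.
library(Rserve)

# "--vanilla" starts R without saved workspaces or site profiles;
# Rserve listens on its default port (6311) unless configured otherwise.
Rserve(args = "--no-save --vanilla")
```

In practice the SAP documentation prescribes additional Rserve configuration; the point here is only that R itself runs outside HANA, reached over a socket.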
Teradata

Teradata has partnered with Revolution Analytics to provide a platform that brings parallelized analytical algorithms to the data. Revolution R Enterprise 7 for Teradata includes a library of Parallel External Memory Algorithms (PEMAs) that run directly, in parallel, on the Teradata nodes. This strategy provides a scalable solution for running analytical algorithms in-database, in parallel, bringing analytics to the data in the true sense.

IBM

Like Teradata, IBM has partnered with Revolution Analytics to provide advanced data analysis capabilities for its PureData System for Analytics platform (formerly Netezza). Revolution R Enterprise for PureData System for Analytics enables the execution of advanced R computations for rapid analysis of petabyte-class data volumes.

Today, businesses are scrambling to build IT infrastructure to extract value from all the data available to them, afraid that their competitors might get there first and gain a competitive advantage. In short, enterprises are now in a Big Analytics arms race. With strong partners, a powerful community and the promise of an easy-to-integrate solution, R is in a great position to capitalize on the Big Data and Analytics revolution.
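As a closing illustration of the in-database PEMA approach described above, here is a minimal sketch using the RevoScaleR package that ships with Revolution R Enterprise. The connection string, share directories, table and model formula are all hypothetical placeholders, not details from the article:

```r
# Illustrative only: assumes Revolution R Enterprise with the RevoScaleR
# package and a reachable Teradata system; every name below is made up.
library(RevoScaleR)

td <- "DRIVER=Teradata;DBCNAME=tdhost;UID=ruser;PWD=secret"

# Point RevoScaleR at a Teradata table without pulling the data to the client.
sales <- RxTeradata(table = "sales_history", connectionString = td)

# Switch the compute context so the algorithm executes on the Teradata nodes.
rxSetComputeContext(RxInTeradata(connectionString = td,
                                 shareDir       = "/tmp/revoShare",
                                 remoteShareDir = "/tmp/revoShare"))

# rxLinMod is a PEMA: the linear model is fitted in parallel, in-database.
fit <- rxLinMod(revenue ~ region + quarter, data = sales)
```

The design point is that only the model, not the data, crosses the wire: the same script runs locally or in-database depending on the compute context.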


More Stories By David Smith

David Smith is Vice President of Marketing and Community at Revolution Analytics. He has a long history with the R and statistics communities. After graduating with a degree in Statistics from the University of Adelaide, South Australia, he spent four years researching statistical methodology at Lancaster University in the United Kingdom, where he also developed a number of packages for the S-PLUS statistical modeling environment. He continued his association with S-PLUS at Insightful (now TIBCO Spotfire), overseeing the product management of S-PLUS and other statistical and data mining products.

David Smith is the co-author (with Bill Venables) of the popular tutorial manual An Introduction to R, and one of the originating developers of the ESS: Emacs Speaks Statistics project. Today, he leads marketing for Revolution R, supports R communities worldwide, and is responsible for the Revolutions blog. Prior to joining Revolution Analytics, he served as vice president of product management at Zynchros, Inc. Follow him on Twitter at @RevoDavid.
