Cindi Howson, gartner.com
Published: 26 February 2018 ID: G00326555
Analyst(s): Rita L. Sallam, James Laurence Richardson, Joao Tapadinhas, Carlie J. Idoine, Alys Woodward
Modern analytics and business intelligence platforms represent mainstream buying, with deployments increasingly cloud-based. Data and analytics leaders are upgrading traditional solutions as well as expanding portfolios with new vendors as the market innovates on ease of use and augmented analytics.
Strategic Planning Assumptions
By 2020, augmented analytics — a paradigm that includes natural language query and narration, augmented data preparation, automated advanced analytics and visual-based data discovery capabilities — will be a dominant driver of new purchases of business intelligence, analytics and data science and machine learning platforms and of embedded analytics.
By 2020, the number of users of modern business intelligence and analytics platforms that are differentiated by augmented data discovery capabilities will grow at twice the rate — and deliver twice the business value — of those that are not.
By 2020, natural-language generation and artificial intelligence will be a standard feature of 90% of modern business intelligence platforms.
By 2020, 50% of analytical queries will be generated via search, natural-language processing or voice, or will be automatically generated.
By 2020, organizations that offer users access to a curated catalog of internal and external data will derive twice as much business value from analytics investments as those that do not.
Through 2020, the number of citizen data scientists will grow five times faster than the number of expert data scientists.
Visual-based data discovery is a defining feature of the modern analytics and business intelligence (BI) platform. This wave of disruption began around 2004 and has since transformed the market, shifting new buying trends away from IT-centric system of record (SOR) reporting toward business-centric agile analytics with self-service. Modern analytics and BI platforms are characterized by easy-to-use tools that support a full range of analytic workflow capabilities. They do not require significant involvement from IT to predefine data models upfront as a prerequisite to analysis, and in some cases will automatically generate a reusable data model (see “Technology Insight for Modern Analytics and Business Intelligence Platforms”). A self-contained in-memory columnar engine facilitates both exploration and rapid prototyping. Modern analytics and BI platforms may optionally source from traditional IT-modeled data structures to promote governance and reusability across the organization. Many organizations may start their modernization efforts by extending IT-modeled structures in an agile manner and combining them with new and multistructured data sources. Other organizations, meanwhile, may use the analytic engine within the modern analytics and BI platform as an alternative to a traditional data warehouse. This approach is usually only appropriate for small or midsize organizations with relatively clean data from a limited number of source systems. The rise in the use of data lakes and the logical data warehouse also dovetails with the capabilities of a modern analytics and BI platform that can ingest these less-modeled data sources (see “Derive Value From Data Lakes Using Analytics Design Patterns”).
Gartner redesigned the Magic Quadrant for BI and analytics platforms in 2016 to reflect this more than decade-long shift. The multiyear transition to modern, agile and business-led analytics is now mainstream, with double-digit growth; meanwhile, spending on traditional BI has been declining since 2015, when Gartner first split these two market segments. Initially, much of the growth in the modern analytics and BI market was driven by business users, often through small purchases made by individuals or within business units. As this market has matured, however, IT has increasingly been driving (with the influence of business users) the expansion of these deployments as a way of broadening the reach of self-service analytics, but in a scalable way.
The crowded analytics and BI market includes everything from longtime, large technology players to startups backed by enormous amounts of venture capital. Vendors of traditional BI platforms have evolved their capabilities to include modern, visual-based data discovery that also includes governance. Newer vendors, meanwhile, continue to evolve the capabilities that were once focused primarily on agility, extending them to greater governance and scalability, as well as publishing and sharing. The ideal for customers is to have both Mode 1 and Mode 2 capabilities (see Note 1) in a single platform, with interoperability and promotion between the two modes.
As disruptive as visual-based data discovery has been to traditional BI, a third wave of disruption has already begun in the form of augmented analytics, with machine learning generating insights on increasingly vast amounts of data. Vendors that have augmented analytics as a differentiator are better able to command premium pricing for their products (see “Augmented Analytics Is the Future of Data and Analytics”).
This Magic Quadrant focuses on products that meet the criteria of a modern analytics and BI platform (see “Technology Insight for Modern Analytics and Business Intelligence Platforms” ), which are driving the majority of net new mainstream purchases in today’s market. Products that do not meet the modern criteria required for inclusion here (because of the upfront requirements for IT to predefine data models, or because they are reporting-centric) are covered in our Market Guide for traditional enterprise reporting platforms.
Magic Quadrant customer reference survey composite success measures are cited throughout the report. Reference customer survey participants scored vendors on each metric; these are defined in Note 2.
The Five Use Cases and 15 Critical Capabilities of an Analytics and BI Platform
We define and assess 15 product capabilities across five main use cases, as outlined below:
- Agile Centralized BI Provisioning — Supports an agile IT-enabled workflow, from data to centrally delivered and managed analytic content, using the self-contained data management capabilities of the platform.
- Decentralized Analytics — Supports a workflow from data to self-service analytics. Includes analytics for individual business units and users.
- Governed Data Discovery — Supports a workflow from data to self-service analytics to SOR, IT-managed content with governance, reusability and promotability of user-generated content to certified data and analytics content.
- OEM or Embedded BI — Supports a workflow from data to embedded BI content in a process or application.
- Extranet Deployment — Supports a workflow similar to agile centralized BI provisioning for the external customer or, in the public sector, citizen access to analytic content.
Vendors are assessed according to the 15 critical capabilities listed below. Changes, additions and deletions from last year’s critical capabilities are listed in Note 3. Subcriteria for each capability are included in a published RFP document (see “Toolkit: BI and Analytics Platform RFP” ). How well the platforms of our Magic Quadrant vendors support these critical capabilities is explored in greater detail in “Critical Capabilities for Business Intelligence and Analytics Platforms.”
- BI Platform Administration, Security and Architecture. Capabilities that enable platform security, administering users, auditing platform access and utilization, and ensuring high availability and disaster recovery.
- Cloud BI. Platform-as-a-service and analytic-application-as-a-service capabilities for building, deploying and managing analytics and analytic applications in the cloud, based on data both in the cloud and on-premises.
- Data Source Connectivity and Ingestion. Capabilities that allow users to connect to structured and unstructured data contained within various types of storage platforms (relational and nonrelational), both on-premises and in the cloud.
- Metadata Management. Tools for enabling users to leverage a common semantic model and metadata. These should provide a robust and centralized way for administrators to search, capture, store, reuse and publish metadata objects such as dimensions, hierarchies, measures and performance metrics/key performance indicators (KPIs), as well as report layout objects, parameters and so on. Administrators should have the ability to promote a business-user-defined data mashup and its metadata to the SOR metadata.
- Self-Contained Extraction, Transformation and Loading (ETL) and Data Storage. Platform capabilities for accessing, integrating, transforming and loading data into a self-contained performance engine, with the ability to index data and manage data loads and refresh scheduling.
- Self-Service Data Preparation. “Drag and drop” user-driven data combination of different sources, and the creation of analytic models such as user-defined measures, sets, groups and hierarchies. Advanced capabilities include machine-learning-enabled semantic autodiscovery, intelligent joins, intelligent profiling, hierarchy generation, data lineage and data blending on varied data sources, including multistructured data.
- Scalability and Data Model Complexity. The degree to which the in-memory engine or in-database architecture handles high volumes of data, complex data models, performance optimization and large user deployments.
Analysis and Content Creation
- Advanced Analytics for Citizen Data Scientists. Enables users to easily access advanced analytics capabilities that are self-contained within the platform itself, through menu-driven options or through the import and integration of externally developed models.
- Analytic Dashboards. The ability to create highly interactive dashboards and content with visual exploration and embedded advanced and geospatial analytics to be consumed by others.
- Interactive Visual Exploration. Enables the exploration of data via an array of visualization options that go beyond basic pie, bar and line charts to include heat and tree maps, geographic maps, scatter plots and other special-purpose visuals. These tools enable users to analyze and manipulate the data by interacting directly with a visual representation of it to display it as percentages, bins and groups.
- Augmented Data Discovery. Automatically finds, visualizes and narrates important findings such as correlations, exceptions, clusters, links and predictions in data that are relevant to users without requiring them to build models or write algorithms. Users explore data via visualizations, natural-language-generated narration, search and natural-language query (NLQ) technologies.
- Mobile Exploration and Authoring. Enables organizations to develop and deliver content to mobile devices in a publishing and/or interactive mode, and takes advantage of mobile devices’ native capabilities, such as touchscreen, camera and location awareness.
Sharing of Findings
- Embedding Analytic Content. Capabilities including a software development kit (SDK) with APIs and support for open standards for creating and modifying analytic content, visualizations and applications, and embedding them into a business process and/or an application or portal. These capabilities can reside outside the application, reusing the analytic infrastructure, but must be easily and seamlessly accessible from inside the application without forcing users to switch between systems. The capabilities for integrating analytics and BI with the application architecture will enable users to choose where in the business process the analytics should be embedded.
- Publish, Share and Collaborate on Analytic Content. Capabilities that allow users to publish, deploy and operationalize analytic content through various output types and distribution methods, with support for content search, scheduling and alerts. These capabilities enable users to share, discuss and track information, analysis, analytic content and decisions via discussion threads, chat and annotations.
Overall platform capabilities were also assessed:
- Ease of Use, Visual Appeal and Workflow Integration. Ease of use to administer and deploy the platform, create content, consume and interact with content, as well as the degree to which the product is visually appealing. This capability also considers the degree to which capabilities are offered in a single, seamless product and workflow, or across multiple products with little integration.
Figure 1. Magic Quadrant for Analytics and Business Intelligence Platforms
Source: Gartner (February 2018)
Vendor Strengths and Cautions
Birst provides an end-to-end cloud platform for analytics and BI and data management on a multitenant architecture. The Birst platform can be deployed on a public or private cloud, or on an appliance (Birst Virtual Appliance). The platform allows customers to choose their own database for the analytics schema, such as SAP HANA, Amazon Redshift, Exasol, and SQL Server.
In 2Q17, Birst was acquired by the privately held company Infor. Birst is now an independent business unit of Infor, reporting directly to Infor’s CEO. Infor was built through a number of acquisitions of diverse software companies across a wide range of application software markets. Five years ago, Infor changed its strategy to focus on providing cloud offerings and niche applications in support of microvertical markets. The ramp-up in cloud demand explains Infor’s choice of Birst as a horizontal data and analytics layer to support its other solutions. Likewise, the microvertical application focus shows the range of opportunities for embedded BI and add-on sales that the Infor acquisition should make available to Birst.
Birst’s product and positioning are centered on its concept of “networked analytics.” Its ability to connect centralized and decentralized groups via a network of virtual instances that share a common and reusable set of business rules and definitions is attractive to organizations wanting to offer self-service in a managed environment. Birst achieves networked analytics via its semantic layer, which can be generated automatically by end users without intervention from IT.
Birst is positioned in the Niche Players quadrant, with improvement in Ability to Execute since last year. Birst scores well for its product capabilities, but is limited by lower scores from customer references on operations and customer experience. Its Completeness of Vision is enhanced by some elements of a visionary product roadmap, but there are shortcomings — particularly in market understanding and marketing strategy.
- Frequently the enterprise standard: Three quarters of Birst’s reference customers said it was their only enterprise standard for analytics and BI. This puts Birst in the top quartile of vendors in this Magic Quadrant for this attribute and is an improvement on last year. Birst’s platform strength in enabling a wide range of analytical needs across both centralized and decentralized analytics makes this status a natural outcome. Additionally, 98% of Birst reference customers expect to continue using the product. Birst has the ability to support both Mode 1 and Mode 2 usage styles in a single platform — supporting everything from data preparation to dashboards to scheduled, formatted report distribution. Because of its broad product functionality, Birst gets some of the highest scores among the vendors in this Magic Quadrant for its product capabilities across the range of all five use cases.
- Cloud and low total cost of ownership: Cloud deployment was the top reason cited for customers choosing Birst, and total cost of ownership (TCO) the second most popular reason. Birst was also in the bottom quartile of vendors for the number of reference customers who said the cost of the technology was a reason to limit further deployment. As a multitenant cloud-provider, Birst has long had subscription-based pricing that can lower the entry point for customers in comparison with traditional perpetual licensing. When looking at limits to further deployment, Birst had the highest proportion of reference customers (66%) who said there were no limits to further deployment within their company. Birst’s existing customers find the platform matches well with their needs.
- Wide range of data sources analyzed: Modern analytics and BI buyers are increasingly faced with the challenge of data being distributed in multiple and varied data stores. Birst allows customers to access and merge multiple data sources while still providing governance. Reference customers access an average of nine data sources with Birst, which is above average for this Magic Quadrant. Birst customers tend to analyze a range of cloud and on-premises data sources. Birst’s reference customers are using a mix of ERP applications (such as those from SAP, Oracle, Infor and Microsoft) as well as custom applications. Salesforce CRM is also a popular data source for Birst customers, illustrating the data gravity that source systems exert on analytics and BI. The platform plays a role in helping customers move to the cloud — more than half of Birst’s customers reported they were analyzing on-premises data that is moved to the cloud for analysis and replicated in Birst’s own data store. Thirty-four percent were analyzing cloud data in Birst, while 12% were analyzing data that remained on-premises, using hybrid connectivity.
- Sales and vertical strategy: With its acquisition of Birst, Infor now has more than 1,000 direct sales reps that can sell Birst; a dramatic increase from only about 70 direct sales reps previously. Birst customers report an overall positive sales experience, which is above average for this Magic Quadrant. Customers are looking for faster time to value, and the use of prebuilt industry and vertical content is an important buying criterion. Here, Birst has a number of prebuilt solution accelerators, with roughly a quarter of its customers using them. Infor expects to further expand on this content, converting what had been traditional BI capabilities to the Birst platform.
- Market understanding and narrower usage: Birst is below average in market understanding, a composite measure that includes complexity of analysis, complexity of data supported and ease of use. A large portion of Birst’s reference customers are using the product primarily for parameterized dashboards and reports, with only small portions using it for more sophisticated analytic tasks, although the product does support complex data models. Birst was evaluated as being below average for overall ease of use, and is particularly low (in the bottom quartile) for ease of content development. Sixty-one percent of its reference customers evaluated the product’s overall ease of use as excellent but, in a market in which ease of use is often the most important buying criterion, this is not enough.
- Customer experience: Birst is in the bottom quartile of vendors for customer experience, driven in particular by low scores from its reference customers for user enablement (apart from the user community) and the availability of skilled resources from the vendor. Achievement of business benefits is one aspect of customer experience in which Birst was below average. It was in the bottom quartile of vendors for customers achieving hard business benefits overall, with low scores in monetizing data, increasing revenue from BI, and reducing non-IT costs, in particular.
- Operations: Operations includes items related to support, product quality and migration experience, for which reference customer scores placed Birst as below average. Scores for overall support placed it in the bottom quartile. Customers are less satisfied with support this year than in the previous year. This may suggest that the acquisition has been causing some disruption, but relatively low reference scores for operations have been a recurring issue for Birst.
- Gaps in product roadmap: Birst was awarded two patents in 2017, relating to augmented data preparation and augmented data discovery. The product has long included the ability to automatically generate dashboards, but the next wave of disruption is about the significance of the insights generated — where Birst currently lags behind other vendors. Gaps in content analytics, real-time streaming analytics and a marketplace are all limitations in this vendor’s product vision.
BOARD delivers a single, integrated system that provides BI, analytics and financial planning and analysis (FP&A) capabilities in a hybrid in-memory, self-contained platform. The company’s stated aim is to provide an “end-to-end decision-making platform.” Since late 2016, BOARD has been focusing more on the U.S. market, where it has seen significant growth in its customer base.
For BOARD, 2017 was a year of retargeting the sales efforts of the company away from small or midsize businesses (SMBs) toward larger enterprises, while also successfully transitioning to a cloud and subscription-based model. Cloud now generates more than twice as much revenue as on-premises software sales. BOARD grew its total company revenue by more than 45% during the past year.
BOARD is a Niche Player in this Magic Quadrant. It successfully serves a submarket for centralized, single-instance BI, analytics and corporate performance management (CPM) platforms. BOARD has a narrow focus and limited market awareness, but growing regional adoption. BOARD is well-positioned in this unified BI/CPM submarket and offers its platform on-premises as well as in the cloud.
- Unified analytics, BI and CPM platform: BOARD is one of only two vendors here offering a modern analytics and BI platform with integrated financial planning and reporting functionality (the other being SAP, with SAP Analytics Cloud). As such, it is highly differentiated for the relatively small volume of buyers looking to close the loop between BI and financial processes. Although this “analyze, simulate, plan” integration is its key differentiator, BOARD did provide evidence that it is also both suitable and selected for read-only analytics and BI deployments, though in Gartner’s view this is less commonly the case.
- Centralized and decentralized use cases: According to its reference customers, the two main use cases for BOARD are now agile centralized BI provisioning and decentralized analytics. BOARD is now deployed more by its BI customers in a decentralized mode than in a centralized one. This is a change from prior years, and reflects the capabilities of BOARD’s single-instance platform and its proprietary hybrid in-memory capabilities for meeting self-service analytics needs.
- Broad usage and analytic complexity: BOARD again achieved one of the top reference scores in terms of breadth of use. This looks at the percentage of customers using the product across a range of BI styles — from viewing reports, creating personalized dashboards and doing simple ad hoc analysis, to performing complex queries, data preparation and using predictive models. BOARD’s reference customers also placed it in the top quartile for the complexity of analysis performed with its platform. This change reflects an evolution of the reported usage of BOARD away from simple reporting and dashboarding toward interactive visualization and data discovery. Note, however, the caution below regarding data volumes analyzed.
- Customer perception: BOARD’s reference customers seem satisfied and loyal, with almost no clients indicating that they plan to discontinue their use of its product. In addition, no really significant limitations to its wider use were expressed. Further, when asked to evaluate the success of BOARD in their organization, customer references placed it in the top quartile of vendors for this Magic Quadrant. Critical, and very positive in an increasingly competitive market for BOARD, is the improvement in its reference customers’ views of the company’s viability as a supplier.
- Used on small, more traditional datasets: BOARD reference customers reported using the lowest overall data volumes of any vendor, and all input data used came from relational sources, with no Hadoop or NoSQL data sources being used by these customers. BOARD’s core cube architecture — based on multidimensional online analytical processing (MOLAP) or relational online analytical processing (ROLAP) — can become a limiting factor. This is especially true for clients that need to access and analyze diverse data sources, and for those who wish to perform complex types of analysis on them.
- Keeping user enablement in focus: In 2017, BOARD launched new initiatives to improve customer enablement, including an online user community and its first-ever user conference with 500 people attending. However, these have yet to influence its customer reference scores, with BOARD still in the bottom quartile for documentation and online tutorials, and its user community. BOARD continues to work on improving enablement in order to satisfy a growing global customer base. The strength of a product’s user community is an increasing driver of selection.
- Product vision: BOARD lacks a compelling vision for the future of its platform. Release 10.2 added entry-level natural-language processing (NLP), search-generated analytics and automatic narrative creation, but these were reactive to trends set by other vendors. Its future roadmap for BOARD 11 is dominated by re-engineering its core calculation and storage engine, to take advantage of parallelization and concurrency in order to speed data loading and performance. While this is required, it’s not really the type of externally facing visionary technology plan that will compel people to get excited about BOARD, or to consider it for evaluation.
- Marketing awareness: BOARD has improved its sales approach and is reaping the results of its move to cloud deployment in terms of net new customers gained. However, the perception of it as an option lags behind the rest of the market. Based on Gartner interactions with evaluators, very few organizations longlist BOARD for consideration. It remains a little-known brand and needs to invest in raising its profile in the minds of potential buyers. The main plank of BOARD’s go-to-market plan is to “become the natural choice to modernize traditional BI platforms,” but to deliver on this mission it needs to be more widely known.
Domo is a cloud-based analytics and BI platform aimed at senior executives and line-of-business users who need intuitive business-facing dashboards. Domo is often deployed in the line of business with little or no support from IT. A higher percentage of its customer references report using it primarily for decentralized use cases than almost all of the other vendors in this Magic Quadrant.
During the past year, Domo added further support for embedded or OEM use cases (with Domo Embed and Domo Everywhere), extended its prepackaged dashboard content (Domo Business-in-a-Box), and added functionality for filtering and sorting data visually in Domo Analyzer. It also announced its first foray into applying machine learning, Mr. Roboto, which will initially be applied to data integration within Domo’s “Magic” ETL module. Domo also extended its freemium offering to a year for up to five users and 5 million rows of data.
Domo’s marketing and sales efforts are funded by a large venture capital reserve (now totaling $690 million) and have resulted in a high level of awareness. It has been able to expand its customer base in an increasingly crowded and price-sensitive market, with positive execution on many measures related to customer experience. Domo’s easy-to-use platform has translated into good market understanding. As in 2017, however, its limited geographic presence and a product vision that remains focused more on closing gaps with the current Leaders than on disruptive innovation that competitors will emulate, place it in the Niche Players quadrant.
- Ease of use: Domo is in the top quartile of the vendors in this Magic Quadrant for its ease-of-use scores. It was evaluated as being top of all vendors in this Magic Quadrant in four out of five ease-of-use categories by its reference customers, including for its visual appeal. Reflecting this strength, Domo was primarily selected by its reference customers for its ease-of-use capabilities, followed by data access and integration, and its cloud deployment model.
- Rapid deployment of management-style dashboards: Domo is well-suited to rapid deployment of intuitive management-style dashboards. Its native cloud approach, plus an extensive range of prebuilt connectors to cloud-based data sources and applications, feeds Domo Apps (both free and premium), which are out-of-the-box content packs with KPIs and dashboards. According to Domo, almost 90% of its customers use prebuilt content. From discussions with users it is evident that Domo’s ability to connect to enterprise applications is a differentiator, in that Domo maintains API-like connectors that can respond dynamically to changes in source-side schemas.
- Business outcomes: Domo’s reference customers report that they are achieving business benefits using its platform, placing it in the top quartile in both the soft and hard business benefit categories measured. Domo’s scores also led to top quartile placement for product quality and migration experience, and while one might expect such high scores for migration experience for all cloud vendors, this is often not the case. Domo’s reference customers are also strongly positive about its viability and future — highest of all vendors — and also report greater success with the product this year compared with last year.
- Sales experience: Based on the views of its reference customers, Domo provides an excellent customer experience overall. References scored the Domo sales experience higher than that of any other vendor. Postsales, a key part of ongoing customer experience is the enablement capabilities available. Here, Domo also scored highly — in the top quartile for the availability of skills in the market and from the vendor, and for training, documentation and tutorials. In addition, Domo’s customer relationship managers provide a high-touch approach.
- Complexity of analysis: Domo’s key strength and most-used capability is management dashboards, which, according to its reference customers, is what 68% of users use the platform for. While Domo has improved its interactive data discovery and visualization functionality with Domo Analyzer, relatively few users (17%) are using the product for interactive visual exploration. This finding is reflected in Domo’s below-average score for complexity of analysis, a key element of market understanding. The challenge for Domo is to move the profile of its customer base from one mostly focused on fixing the gaps in traditional KPI-based dashboarding to more complex forms of analysis.
- Low standardization rate: As previously, few of Domo’s reference customers consider it to be their enterprise analytics and BI platform standard. The only vendor to score lower here is much younger than Domo. Given the very strong value its customers evidently get from using the Domo platform, this score seems surprising. Yet, Domo is often deployed by the line of business — in isolation from domain-specific business analysis such as marketing, finance and supply chain. As long as the cost-benefit ratio is in balance, being the “standard” may not matter, particularly if the land-and-expand sales model is functioning well. There is evidence in the survey that Domo’s freemium model is delivering more “lands,” in that this year’s average user deployment is at a lower level. Moving customers from initial installs to more far-reaching deployments that may grow into a standard over time should be an area of focus.
- Cost as limitation to broader deployment: More than one-third of Domo’s reference customers cited cost as a limitation to its broader deployment — despite the potential cost benefits from cloud deployment. Pricing pressure has become more acute, with more competition and large players introducing dramatically lower cost offerings. This dynamic is likely to increase, because the market has gone mainstream and price/value becomes a more important buying criterion for large enterprise deployments.
- Support issues: The sole operational factor where Domo customers evaluated it as being below average was support. Reference customers continue to cite this as one of the limitations to its broader deployment. These customers noted issues with response time and time to resolution; support growing pains are not uncommon in a rapidly growing organization such as Domo.
IBM is represented with two products this year: IBM Cognos Analytics and IBM Watson Analytics. Cognos Analytics version 11 and higher represents the rebranding of the Cognos Business Intelligence product (version 10.2.2 and earlier), to signify combining production reporting capabilities with self-service dashboards and ad hoc analysis all within one modern analytics and BI platform. An improved user experience and the initial inclusion of augmented capabilities, specifically search, within Cognos Analytics makes it an easier-to-use and more visually appealing platform. It is available both on-premises and as a hosted solution on the IBM cloud. Watson Analytics also provides augmented analytic capabilities, including automated pattern detection, support for NLQ generation and embedded advanced analytics via a cloud-only solution.
The redesigned Cognos Analytics platform, which combines IT-authored content with content authored by business users, has been slow to gain traction. The availability of a second, separate analytics and BI platform in Watson Analytics adds confusion, from the perspective of differentiating both between the two IBM offerings and between two similarly named but unrelated Watson products.
IBM remains positioned in the Visionaries quadrant this year, but has lost some traction relative to other vendors in the market. It was hindered by low overall product scores as well as those for sales execution and customer experience. While IBM’s product roadmap has many visionary elements, its market understanding in terms of complexity of analysis lowers its overall position.
- Augmented and easy to use: IBM has been a leader in incorporating augmented analytics or “smart” capabilities into its products (first with Watson Analytics and increasingly with Cognos Analytics) to simplify data preparation, create basic visualizations and perform more advanced predictive analyses. These capabilities drive ease of use and enable business users and citizen data scientists to leverage analytic capabilities.
- All-in-one platform: The ability to create both Mode 1 and Mode 2 analytic content in the Cognos Analytics platform is a unique differentiator for IBM in a market that often dictates the requirement of separate tools for standard reporting and more visual/exploratory analysis. IBM’s reference customers choose Cognos Analytics mainly for its superior functionality. The modern platform includes scheduling and alerting — features historically lacking in modern platforms. Customers can also leverage existing Cognos Framework Manager models as a source for new dashboards and explorations, thus bridging reusability, governance and agility.
- Global, socially responsible vendor: IBM has a broad global presence and an ability to support customers in all geographies, which is consistent with many megavendors but in contrast to almost half the vendors in this Magic Quadrant. It also exemplifies attentiveness to and active participation in social initiatives. Examples include providing free or reduced-cost software to academia, instituting a corporate policy on environmental affairs, designating a chief diversity officer, and running its Science for Social Good initiative, which partners IBM Research scientists and engineers with academia and subject matter experts to tackle societal challenges using science and technology.
- Large incumbent user base: IBM has a strong traditional BI user base, many of whom are investigating more modern options and are receptive to continuing to leverage the IBM platform if the product offering is sound. Existing Cognos BI customers can upgrade to Cognos Analytics as part of maintenance, so license cost is not a barrier to adoption. Watson Analytics, on the other hand, is licensed per user, but is available via digital sign-up. The pricing for its Plus ($30 per user per month) or Professional ($80 per user per month) offerings is lower than for other products with robust augmented analytics capabilities, and allows for an easy way to investigate and trial. Reference customers for IBM cite low license cost as the top reason for buying Watson Analytics.
- Promising revamp still slow to gain traction: Cognos Analytics offers an interesting and innovative proposition but, based on Gartner customer inquiries, has been slow to gain traction in terms of both new customers and upgrades from existing customers. Many Cognos Analytics customers have not yet upgraded to the redesigned version and continue to investigate additional external options for modernizing. IBM’s bifurcated product strategy continues to cause confusion and concern regarding its commitment to providing a fully comprehensive and cohesive analytics and BI platform.
- Limited complexity of analysis: In terms of market understanding, above-average ease-of-use scores from IBM customer references are offset by the lowest scores for complexity of analysis — placing IBM in the lowest quartile for this measure. Fifty-four percent of the reference customers indicated that they use the platform primarily for parameterized dashboards, and use it less often for more sophisticated tasks such as self-service data preparation or interactive visual exploration.
- Gaps in product capabilities: Even though reference customers for IBM cite superior functionality as a reason for buying Cognos Analytics, IBM’s overall product scores (for both products) were the lowest of any vendor in this Magic Quadrant. Twenty percent of reference customers indicated that weak or lacking product functionality is a problem with the platform, compared with an average of 10% for the other vendors. Both products have limitations in terms of mobile capabilities, as well as scalability and model complexity. Each product has its own gaps. For example, Watson Analytics is less mature than Cognos Analytics regarding administration and security, but introduces more complex analysis with advanced pattern detection; Cognos Analytics is better for metadata management than Watson Analytics.
- Lower customer experience, sales experience and operations: After an improvement last year, customer reference scores for customer experience and sales experience were lower this year. IBM’s customer experience scored in the bottom quartile of vendors in this Magic Quadrant for overall business benefit and the below-average availability of its skilled resources. Sales experience decreased for Cognos Analytics in particular, placing IBM in the bottom quartile of vendors, despite 58% of reference customers evaluating their customer experience as “excellent.” IBM’s reference scores for operations also placed it in the bottom quartile. This evaluation was driven by its being in the bottom quartile for overall support and product quality — which could be attributed to the breadth of the solution and the associated complexity of upgrades.
Information Builders sells multiple components of its integrated WebFOCUS analytics and BI platform. For this Magic Quadrant, Gartner has evaluated InfoAssist+, which comprises a number of components from the WebFOCUS stack, as the foundation of its modern analytics and BI offering. While Information Builders is known for delivering analytic applications to large numbers of mainstream users in more operational or customer-facing roles, with WebFOCUS, InfoAssist+ is intended to satisfy modern, self-service analytics and BI needs.
In May 2017, Information Builders announced that Goldman Sachs’ Private Capital Investing group had made a growth equity investment in the company. As a result, the firm has new strategic advisors and equity to invest in new programs — initially in its go-to-market approach around prepackaged analytic apps. From a product perspective, the InfoAssist+ release of October 2017 strengthened the NLQ and search functionality for both metadata and existing content, in order to deliver recommendations based on machine-learned patterns to the user. More importantly, the latest version is intended to improve the overall user friendliness of InfoAssist+ via user experience (UX) improvements aimed at enabling business developers. These capabilities were too new to be widely used by the customers surveyed for this report. The vendor added natural-language generation (NLG) to its offering via an OEM.
Information Builders is positioned in the Niche Players quadrant. InfoAssist+ still has little visibility or momentum in the market outside Information Builders’ own installed base, and is not being evaluated in many competitive sales cycles.
- Traditional to modern reach: InfoAssist+ is a combination of visual data discovery, reporting, rapid dashboard creation, interactive publishing, mobile content and the Hyperstage in-memory engine. Users can create their own analytic content and promote it as InfoApps (interactive, analytical apps for nontechnical users) on the WebFOCUS Server with scalable distribution. InfoAssist+ can also be completely decoupled from the WebFOCUS Server, enabling easier implementation. This blend of capabilities means that InfoAssist+ is deployed for a wide range of use cases: most frequently for decentralized analytics, followed by agile centralized BI and traditional IT-centric reporting, according to its reference customers. The shift to decentralization marks a change in how InfoAssist+ is being used, and better alignment with key modern buying criteria.
- Core functional capabilities: InfoAssist+ has strong functionality in analytics and BI platform administration, security and architecture, mobile exploration and authoring, embedded analytic content, data source connectivity and ingestion, and self-contained ETL and data storage capabilities. The primary reasons customers cite for selecting InfoAssist+ remain data access and integration.
- Customer support: The support services offered were given positive scores by the InfoAssist+ reference customers, across all three areas: level of expertise, responsiveness and time to resolution. In fact, Information Builders has often differentiated itself by attempting to provide best-in-class customer service, wanting to be customers’ partner for the long term. The vendor’s reference scores for ethics, culture and diversity placed it in the top quartile of vendors for this Magic Quadrant, with 85% describing this as “excellent.” Work has gone into strengthening the availability of skilled resources in the market; a weakness last year, this has now been turned around. Information Builders has signed more than 40 new partners into a new reseller program specifically for InfoAssist+.
- Prepackaged analytic apps: A long-standing area of strength for Information Builders, these provide prebuilt assets and customizable data models designed for a variety of vertical and horizontal areas. More than 30% of this vendor’s customer base use these offerings, which now cover banking, healthcare, insurance, law enforcement, visual warehouse/facilities management and retail.
- Lack of market momentum: Based on new customer acquisition, searches and inquiries, Information Builders has not generated an overwhelming amount of interest — especially for a company trying to position InfoAssist+ as a modern analytics and BI platform. The InfoAssist+ offering is primarily sold into its existing WebFOCUS Server installed base, as part of its traditional information application core business. It is not typically sold stand-alone. That said, with the move toward decentralization evident in our reference customer survey sample, InfoAssist+ deployments are growing (averaging 718 users in the sample for this Magic Quadrant).
- Ease of use: Ease of use remains considerably below average for Information Builders, which was placed in the bottom quartile for this overall. Although now being used more for decentralized use cases, the UX profile of InfoAssist+ does not match this requirement well. The only area of ease of use where reference customers scored it above average was in administration and deployment, which are IT tasks. In the more self-service forms of ease of use (content development and content consumption), it had below-average scores from its reference customers, and in the area of visual appeal it was placed lowest of all products in this Magic Quadrant. As in 2017, customers using InfoAssist+ reported difficulty of use as a problem more than for any other product in this Magic Quadrant. Further, ease of use for business users was among the most frequently cited reasons limiting wider deployment in customer organizations, placing Information Builders in the top quartile for this concern. The newest release is beginning to address these issues.
- Areas of limited functionality: Although strong in a number of functional areas, InfoAssist+ currently has limited functionality in some more modern areas, specifically self-service data preparation and augmented analytics categories. These areas are, however, on the product roadmap.
- Narrow usage and business outcomes: According to its reference customers, InfoAssist+ is used mostly for parameterized reports and dashboards and simple ad hoc analysis. It is used less often than the survey average for interactive data discovery and visualization, perhaps as a result of issues with user friendliness. As such, its breadth of use — particularly as it applies to the defining modern interaction model — is an issue. This narrow applicability impacts its reference scores for business benefit, where it is placed as below average in the majority of benefit categories, and in the bottom quartile for “expanding types of analysis” as an outcome gained from its use.
Logi Analytics (“Logi”) is best known for its ability to embed analytic content in websites and applications, and to enable end-user organizations to extend their BI access externally to customers, partners and suppliers. Its Logi Platform is composed of Logi Info and DataHub. Logi Info itself is composed of two parts: a set of capabilities for software product managers and developers to build embeddable analytic apps; and a self-service module for end users to create and interact with dashboards and data visualization. Logi’s DataHub is a data preparation and columnar data store that enables users to ingest, blend and enrich data from multiple sources.
For Logi, 2017 was a year of significant change. First, the company went through a major positioning and packaging overhaul in order to focus exclusively on the OEM and embedded use case for analytics and BI platforms. Second, Logi Analytics was acquired by Marlin Equity Partners, a global private equity firm that specializes in growing software businesses, in October 2017. Logi will remain a privately held company, but will now be able to leverage the expertise and operational resources of its new owner.
Logi Analytics is positioned in the Niche Players quadrant, reflecting how the Logi Platform is used by the majority of its customers. Its increased focus on its long-standing core strength in embedded analytics for developers is a de facto “niche” approach — serving one use case. The changes in messaging associated with this strategy seem to be causing turbulence among its reference customers, who have, on average, been using the platform for four years. Logi Analytics’ reference customer scores placed it as significantly below average in a number of factors (including customer experience, vendor viability, breadth of use and complexity of analysis undertaken). As Logi had fewer than 25 reference customers responding to the Magic Quadrant reference survey, comments should be taken as directional. These reference survey results are further compounded by relative weakness in marketing strategy, geographic strategy and verticalization. Logi is in a period of transition. The company has an evident specialism in embedding modern analytics and BI within other apps and business processes, and is well-aligned with that use case. It is less well-aligned with the wider definition used for evaluation in this Magic Quadrant.
- OEM/embedded use-case specialism: Logi Platform is deployed in an embedded use case by more of its customers than all but one of the vendors in this Magic Quadrant. It continues to score highly in the embedded use case from a product perspective. Logi now leads with the concept of “developer-grade analytics,” and has made significant investment in how it goes to market in order to ensure its offering aligns with product management and developer buyer personas. The early signs are that this strategy is working, with Logi achieving profitability, increased product revenue and growth, and improved sales and marketing efficiency.
- Functional strength: Based on reference customer feedback and Gartner’s evaluation of the technology, Logi Platform offers “excellent” to “outstanding” functionality in BI platform administration, security and architecture, data source connectivity and ingestion, interactive visual exploration, analytic dashboards and, as would be expected, in the embedding analytic content capability. In particular, functions such as rapid security integration and real-time database write-back make it very attractive for both OEMs and large enterprise customers that want to embed modern self-service BI in other apps.
- Cost-effectiveness: According to its reference customers, the joint-top reasons for selecting Logi Platform are license cost and ease of use for content developers/authors. This finding is reinforced when it comes to seeing the cost of software as a barrier to wider deployment; Logi’s reference customers are among the least concerned of all about this issue. In part, this lack of concern can be attributed to the product’s attractive and flexible core-based pricing model.
- OEM customer-oriented predictive analytics: Logi’s product vision for enabling the embedded use of analytics goes beyond the natural tendency to meet only descriptive (reporting) and diagnostics (interactive visualization) requirements in embedded apps. The intent to provide sophisticated embeddable analytics is evident in how it is targeting the citizen data scientists that its partners are serving — via functionality to autogenerate models and embed predictive analytics in OEM applications. Significantly, this can allow the end customer of the OEM to retrain predictive models on their data in a self-service mode.
- Simple usage predominates: Consistent with last year, Logi is in the bottom quartile for complexity of analysis. Logi’s reference customers’ scores placed it bottom overall for breadth of use. Overall, Logi reference customers reported the lowest use of any product in this Magic Quadrant for interactive data discovery and visualization tasks. The bulk of users use Logi Info to consume parameterized reports and dashboards.
- Concerns over market relevance and customer retention: Logi’s viability to continue serving the analytics market was considered to be among the weakest of all vendors in this research by its reference customers. It was the only vendor where customers reported a lower viability rating than in the prior year. Gartner’s definition of viability means a vendor’s ability to serve the analytics and BI platform needs of its customers, and its competitive place in the market for these offerings. Lower-scoring vendors may remain as stand-alone operating companies, but serving a niche market only. Five of Logi’s reference customers had plans to either discontinue or reduce their use of its platform. This sentiment may not be representative of the total customer base. (Note that the survey closed prior to the announcement of Logi’s acquisition, so this cannot have affected customer perception.)
- No specific cloud offering: Logi Platform can run within Amazon Web Services (AWS) and Microsoft Azure, but Logi does not provide a managed cloud service and neither manages nor hosts customer data. As part of an OEM or embedded deployment, the Logi Platform can be deployed in the cloud.
- Customer experience: While reference customers scored Logi’s operations (product quality, support and migration experience) as above average, in other areas of customer experience the evaluation was lower. The company was evaluated as lowest of all vendors for customer enablement: only documentation was above the survey average, while reference scores for other services (user conference, online community, training, skills availability) all placed it in the lowest quartile. From the point of view of business benefit outcomes, reference customers also evaluated Logi Platform in the lowest quartile. These results are markedly down on those from the 2017 Magic Quadrant, perhaps reflecting disquiet in Logi’s installed base following a reduction in its salesforce (as evident on LinkedIn) and as it pivots to focus on the OEM/embedded analytics business.
Looker is a modern analytics and BI platform that enables users to integrate, explore and visualize data. Looker is primarily deployed in the cloud, but can also be deployed on-premises. Core to its approach is its data modeling language, LookML, in which data analysts write code to define business metrics and manipulate data. The platform supports a wide range of data sources and visualizations, and can be embedded in websites, portals and applications.
Founded in 2012, Looker is venture-capital funded and received an additional $81.5 million in 2017, bringing the total funding to date to $177.5 million.
Looker positions itself as a “data platform,” although it is not a database storage platform and doesn’t have its own in-memory engine. It supports a range of user types along with applications including BI, marketing analytics, sales analytics, web analytics and product analytics. Key elements of the platform introduced in 2017 are Viz Blocks, which are predefined visualizations, and Data Blocks, which are data sources combined with data models and include weather data, demographic data, key economic indicators and geographic mappings. Looker also supports “actions” by integrating with a range of applications to take action within them. Examples of such actions include sending data to Amazon Simple Storage Service (S3) storage or sending an email through the SendGrid email platform.
New to the Magic Quadrant in 2018, Looker is positioned in the Niche Players quadrant — showing a good balance between Completeness of Vision and Ability to Execute.
- Agile product that supports a wide range of use cases: Looker’s reference customers report the highest proportions of users in each of the key use cases, with 93% using it for decentralized analytics, 75% using it for agile centralized BI, and 70% for governed data discovery. Looker customers like the product, find it flexible to use, and expect to use it for a range of types of analysis. The product includes its own language, LookML, which is based on SQL but adds the potential for greater reusability, better organization of code and more sophisticated analytics. While most of the products in this Magic Quadrant offer their own in-memory analytic engines, Looker differentiates itself with its in-database approach, particularly for newer analytics databases such as Google BigQuery and Amazon Redshift.
- Operations and support: Looker’s reference scores placed it joint top (along with Sisense) of all the vendors in this Magic Quadrant for its operations, which includes support, product quality and migration experience. Looker’s scores placed it in the top quartile for support overall, and also in terms of level of expertise, response time and time to resolution. Unlike many other vendors here, customers can contact Looker support via email or online chat, with no need to fill in forms or quote ticket numbers. All support analysts are trained to the same level, with no distinct support tiers. Looker was placed in the top quartile for sales experience.
- Market momentum: Existing customers are growing their implementations, based on Looker’s customer and user retention figures. The vendor now reports more than 1,200 customers, a 56% growth rate year over year. With the additional round of funding, Looker has been growing its employee head count aggressively: about 400 as of December 2017, a roughly 70% growth rate year over year. Interest in searches on Looker from Gartner clients has also increased substantially year over year.
- User enablement: Startup vendors often invest in user enablement programs later in the company’s history, yet Looker has prioritized this. Looker’s reference customers’ scores place it as above average for user enablement. Customers also report a high level of skilled resources available from both the vendor and the market, and, because LookML is built on SQL, it is easy for SQL developers to learn.
- Ease of use for business users: A third of Looker’s reference customers reported that ease of use for business users was a limitation to its wider deployment. This was the highest proportion of any vendor in this Magic Quadrant. Those who can use Looker like it a lot, but the code-based approach for authoring is less accessible to the mass market of business users than the GUI-based approaches that are driving much of the net new buying here. Looker’s composite ease-of-use score also placed it last of all Magic Quadrant vendors.
- North America focus: As a relatively new player, Looker is expanding internationally at a healthy rate but is not yet the global player that some of its competitors are. Its only direct international office presence in Europe is in the U.K. and Ireland. Documentation and support are only available in English.
- Product challenges: Looker’s platform lacks advanced features such as advanced algorithms, automatic insight generation and natural-language integration, and has some gaps in one of the most important critical capabilities — interactive visual exploration. Machine-learning support for navigation, exploration and content building is on the product roadmap. Poor performance deterred 8% of Looker’s reference customers from further deployment, the second-highest level of concern in this area. In part, this is where Looker’s product approach of leveraging only in-database processing puts it at the mercy of how well-tuned the underlying database is.
- Narrow product roadmap: Looker has a number of missing areas in its product roadmap, which is less focused on differentiated functionality. For example, while augmented data preparation is part of the roadmap, augmented insight generation, natural language query and generation, and a marketplace for developers are all gaps. In addition, while cloud deployments are a selling point for new deployments, the ability to manage users and content across cloud and on-premises cohesively is a gap in the product roadmap.
Microsoft Power BI offers data preparation, data discovery, interactive dashboards and augmented analytics via a single product. It is available as a SaaS option running in the Azure cloud or, new in 2017, as the on-premises option Power BI Report Server. Power BI Report Server allows users to share reports (but not dashboards) and lacks some of the machine-learning capabilities found in Power BI SaaS. Power BI Desktop can be used as a stand-alone, free personal analysis tool and is also required when power users are authoring complex data mashups involving on-premises data sources.
Microsoft Power BI Pro’s list price is $9.99 per user per month, making it one of the lowest-priced solutions on the market today. In 2017, Microsoft introduced Power BI Premium, with a starting price of $4,995 per month (price depends on scalability and concurrent usage requirements). Power BI Premium acts as a virtual server in the cloud, allowing for greater storage and data refresh frequencies than the Pro version, and without individual named user licenses.
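As a rough illustration of the pricing dynamic, the list prices quoted above imply a breakeven point at which Premium's flat capacity fee undercuts per-user Pro licensing. The sketch below is a simplification under stated assumptions — it uses only the starting list prices in the text, ignores the scalability and concurrency tiers that determine actual Premium quotes, and holds amounts in integer cents to avoid floating-point drift:

```python
import math

# Hedged sketch: list prices quoted in the text, not a full pricing model.
PRO_CENTS_PER_USER = 999    # Power BI Pro: $9.99 per user per month
PREMIUM_CENTS = 499_500     # Power BI Premium: $4,995 per month starting price


def breakeven_users() -> int:
    """Smallest user count at which Premium's flat fee costs no more than
    licensing every user individually with Pro."""
    return math.ceil(PREMIUM_CENTS / PRO_CENTS_PER_USER)


print(breakeven_users())  # 500 users at these list prices
```

At these list prices the crossover is around 500 users, which is consistent with Premium being positioned for large enterprise deployments rather than small teams.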
Microsoft is positioned in the Leaders quadrant again this year, with continued strong uptake of Power BI, and high levels of customer interest and adoption. Microsoft has a clear and visionary product roadmap that includes vertical industry content.
- Low-priced incumbent: Many organizations own Microsoft Power BI, often through enterprise software agreements; so, even when Microsoft is not yet deployed, the vendor has changed the modern analytics and BI shortlist from a greenfield evaluation to being the incumbent. Microsoft significantly grew its subscriber base in 2017. Microsoft has put downward pricing pressure on the analytics and BI market with its low per-user and now virtual server subscription price. License cost was the second most important reason for reference customers choosing Microsoft Power BI, with 12% of them citing this as a reason for selecting it.
- Ease of use and visual appeal: Microsoft’s customer reference scores place it in the top quartile for ease of use, with 14% of customers citing this as the main buying criterion. Power BI’s reference scores also place it in the top quartile for visual appeal. Winning customers within the first few minutes has been part of Microsoft’s “five by five” strategy — five seconds to sign up and five minutes to “wow” the customer. There are a number of features within the product that contribute to its overall ease of use, including its primarily cloud deployment model. Microsoft was early to the industry in terms of a natural language/search interface with Microsoft Q&A (Question and Answers), which allows users to create a visualization using search terms.
- Product vision: Microsoft continues to execute on its roadmap. Microsoft Quick Insights (a feature within Power BI) is a basic form of augmented data discovery, initially available in the cloud service. In 4Q17, however, it was also introduced in the desktop client for variance analysis — to identify what has most contributed to changes in sales, for example, from one quarter to the next. The conversational chatbots with Microsoft Cortana also are differentiators, and Microsoft Surface Hub is touch-enabled and supports the display of Power BI content on wall-size displays in executive boardrooms. The ability to provide a closed-loop process from insight to action is supported by the integration of Power BI with Microsoft Flow and within its business application, Microsoft Dynamics 365. Virtual reality and Power BI integration with HoloLens are currently works in progress.
- Customer experience: Microsoft and Sisense shared the highest customer reference scores in this Magic Quadrant for achievement of business benefits — in particular, for making better information and insights available to more users. The high achievement of this business benefit is enhanced by both the low price and cloud deployment. User enablement and the availability of skilled resources are other elements of the overall customer experience for which Microsoft is evaluated as above average. The vendor has a strong community of partners, resellers and individual users; and this community extends the product with prebuilt apps, visualizations and video tutorials, in addition to the content provided directly by Microsoft.
- Mode 2 analytics, Azure cloud only: Power BI has mainly focused on the requirements of Mode 2 (agile, self-service) analytics, while its on-premises SQL Server Reporting Services serves the needs of Mode 1 (scheduled, distributed reports). For Microsoft customers, this has resulted in a two-product deployment with different capabilities and different deployment approaches. Microsoft’s reference customers cited absent or weak functionality (14%) as a limitation to wider deployment, with 27% (the highest percentage of respondents for this Magic Quadrant) citing a range of other problems. These include frequent updates that disrupt functionality, documentation that does not keep pace with the release cadence, and intermittent problems with the Enterprise Gateway used to connect live to on-premises data stores.
- Breadth of use: As in previous years, Microsoft’s scores from its reference clients place it in the bottom quartile for breadth of use. Breadth of use looks at the percentage of users employing the product for a range of BI styles: from viewing reports, creating personalized dashboards and doing simple ad hoc analysis, to performing complex queries, preparing data and using predictive models. The majority of Microsoft’s reference customers (59%) mainly use Power BI’s parameterized reports and dashboards, rather than using it for more complex tasks. This pattern of use suggests that more advanced data preparation is performed outside Power BI and/or that IT is building common dashboards for many users to consume. On average, 20% of business users author their own content with Microsoft Power BI, placing it in the bottom quartile of vendors in this Magic Quadrant.
- Multiple products: The core of Microsoft Power BI is a self-contained product. However, a number of elements of Microsoft’s product vision span multiple products on its roadmap and require integration. For example, insight to action requires Microsoft Flow. The Data Catalog is a separate product, and curated datasets may be provided by Microsoft directly or through a partner. In addition, some capabilities are not native, but are possible through SharePoint or Microsoft Teams. NLQ via a search box is supported natively in Power BI, but more robust voice and conversational analytics is supported through the Cortana personal digital assistant. Data scale-up options have improved, with larger models supported in the Premium version, but the scale-up paths to Azure Data Lake, Azure SQL or on-premises storage are not straightforward. While Microsoft owns the intellectual property for a number of these elements of its vision, turnkey integration remains a work in progress.
- Sales experience: The majority of Microsoft’s reference customers evaluated their sales experience as good to excellent. In relative terms, however, Microsoft is slightly below average for this Magic Quadrant. In part, this may be because Microsoft Power BI lacked a dedicated analytics and BI salesforce, a situation that changed in mid-2017. While Microsoft Power BI is a low-priced option, clients expressed frustration about changes in pricing and packaging, and about the lack of clarity over what Power BI functionality is included with Microsoft Dynamics 365. Microsoft Dynamics 365 uses Power BI embedded for reports and dashboards, but non-Dynamics 365 content requires a Power BI Pro or Premium license.
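The licensing distinction noted above has a practical counterpart for developers who embed Power BI content in their own applications. As a hedged illustration only (the interface below is a simplified placeholder, not Microsoft’s full API surface, and all IDs, URLs and tokens are fictitious), an embedded report is driven by a small configuration object that is ultimately handed to the powerbi-client JavaScript library:

```typescript
// Illustrative sketch: the shape of the configuration object consumed by the
// Power BI JavaScript client (powerbi-client) when embedding a report.
// All IDs, URLs and tokens below are placeholders, not real values.

interface ReportEmbedConfig {
  type: "report";
  id: string;          // report ID, typically obtained via the Power BI REST API
  embedUrl: string;    // embed URL returned alongside the report metadata
  accessToken: string; // Azure AD token ("Aad") or embed token ("Embed")
  tokenType: "Aad" | "Embed";
}

function buildEmbedConfig(
  reportId: string,
  embedUrl: string,
  token: string
): ReportEmbedConfig {
  return {
    type: "report",
    id: reportId,
    embedUrl: embedUrl,
    accessToken: token,
    // An embed token corresponds to the "app owns data" scenario, which is
    // where the Pro/Premium licensing distinction discussed above matters.
    tokenType: "Embed",
  };
}

// In a browser page, the config would then be passed to the client, roughly:
//   powerbi.embed(document.getElementById("reportContainer"), config);
const config = buildEmbedConfig(
  "00000000-0000-0000-0000-000000000000",
  "https://app.powerbi.com/reportEmbed",
  "<placeholder-token>"
);
console.log(config.tokenType);
```

This sketch only constructs the configuration; token acquisition and the actual embed call depend on the chosen licensing model and are omitted here.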
MicroStrategy Version 10 combines self-service data preparation, visual-based data discovery and exploration, and native big data connectivity with enterprise analytics and BI.
MicroStrategy’s fully featured interactive visual exploration experience is combined with best-in-class enterprise reporting and mobile capabilities delivered in a single, fully integrated platform and workflow. This makes it better-suited to large-scale SOR reporting and governed data discovery deployments for large and complex datasets than most other offerings in this Magic Quadrant.
Point releases during the past year (most recently 10.9) delivered a redesigned visual-based exploration experience and a new desktop version with quick-start user tutorials that give new users a fast time to insight. MicroStrategy has introduced the concept of a dossier, a new dashboard experience that makes it easier to find and navigate analytics content. An expanded set of APIs and new software development kit (SDK) data connectivity capabilities are intended to make the platform more attractive for embedded and OEM use cases. MicroStrategy also added the ability to watermark and certify content, and for users to collaborate through discussion threads, among other enterprise feature enhancements.
MicroStrategy moved into the Challengers quadrant this year (from the Visionaries last year). Its improved Ability to Execute is due to its leading product capabilities scores across all use cases, combined with successful investments in customer and operations experience initiatives that have translated into material improvements across most of these measures. However, while there are signs of improvement, and MicroStrategy has new marketing and sales initiatives focused on attracting new customers, limited market momentum and market awareness outside of its installed base continue to hinder its success.
- Strong integrated product for all use cases: MicroStrategy has among the highest product scores of any vendor in this Magic Quadrant, both overall and for all the evaluated use cases. It gained “excellent” to “outstanding” scores for BI administration, architecture and security; data source connectivity; scalability and model complexity; metadata management; mobile; and ease of use, visual appeal and platform workflow integration. For mobile BI in particular, MicroStrategy has been an early innovator, with some of the most comprehensive (including transactional), highly rated and widely adopted capabilities. A higher percentage of MicroStrategy’s reference customers reported selecting the platform for its mobile capabilities than for any other vendor on this Magic Quadrant.
- Agile yet governed enterprise deployments: MicroStrategy 10 has a seamless workflow for promoting business-user-generated data models and content to enterprise sources, while leveraging enterprise features to enable large-scale trusted self-service. Almost 54% of MicroStrategy’s customer references report using the platform for governed data discovery, placing it in the top quartile for this use case. Advanced data manipulation, enterprise-grade security, native Hadoop access and an in-memory columnar data store (PRIME) give business users a highly interactive and comprehensive data exploration experience for very large and complex datasets and models.
- Large centralized deployments with high standardization rates: MicroStrategy’s reference customers report average deployment sizes in the top quartile for this Magic Quadrant, with two-thirds of them using the platform as the enterprise standard — higher than most other vendors evaluated here. Due to its enterprise features and traditional BI heritage, the majority of MicroStrategy customers use the platform in centrally managed deployments, either agile (65%) or traditional IT-centric (59%). These percentages are higher than for most other vendors in this Magic Quadrant, with a below-average percentage using the platform for decentralized deployments.
- Customer experience and operations: More than two years of investment in customer-focused initiatives — a new customer success organization, proactive support packages and a new community portal, among others — have begun to pay off for MicroStrategy. Reference customer scores now place MicroStrategy in the top quartile of these vendors for customer experience, and above average for operations (a significant year-over-year improvement). With the exception of migration experience, which remains below average, above-average scores for support and a top-quartile placement for product quality represent important improvements for MicroStrategy’s customers. Key elements of MicroStrategy’s user enablement capabilities continue to progress: updated content is delivered through the new user community, complemented by the redesigned desktop offering with embedded quick-start guidance for new users.
- IT involvement for enterprise deployments: Streamlining the configuration, administration and migration of enterprise deployments remains an area of ongoing investment for MicroStrategy. However, while reference customers report improvements across most areas of ease of use, the same is not true for implementation services, ongoing administration and content development, which still require significant IT involvement. MicroStrategy’s reference customers report a longer time to develop simple, moderate and complex analytics content than those of most other platforms evaluated.
- Limited traction beyond the installed base: MicroStrategy Version 10 is increasingly considered a viable option for agile self-service and content authoring in the MicroStrategy installed base, as an integrated, governed alternative to its competitors. However, despite having a strong product, Gartner sees MicroStrategy on new buyer shortlists at markedly lower rates than the Leaders in this Magic Quadrant. Awareness of the product and its differentiators is limited with new buyers. Moreover, sales growth and momentum have been negative, although there is the potential for improvement — with renewed marketing investments by MicroStrategy, the redesigned desktop and an improved land-and-expand sales process.
- Product vision: MicroStrategy’s roadmap includes a number of high-priority investment areas. However, the major focus continues to be on expanding enterprise features, embedded analytics, and data size and complexity, rather than on disruptive innovation and AI-driven user experiences. These latter two are likely to be the defining features of analytics and BI during the next three to five years, and MicroStrategy is a follower here.
- Gaps in cloud and augmented analytics: During the past year, MicroStrategy released third-party integrations with Automated Insights and Narrative Science for NLG, and has made some investments in chatbot integration. However, augmented data discovery features, such as automated insight generation and NLQ, are still lacking in the current product (though there is some evidence of them, particularly NLQ, on the product roadmap). MicroStrategy’s single-tenant cloud solution lacks packaged domain and vertical content, and a robust content marketplace for customers and partners. Although MicroStrategy was early to invest in the cloud, it also has among the highest percentage of its reference customers reporting that they have no plans to consider deploying it in the cloud. MicroStrategy’s new free desktop version, coupled with point-and-click easy AWS provisioning of new departmental deployments, could not only improve its ability to land and expand and attract new customers, but also increase cloud adoption for new customers in the future.
Oracle offers a broad range of BI and analytic capabilities, both on-premises and in the Oracle cloud. During the past year, Oracle rebranded its cloud offerings as Oracle Analytics Cloud (OAC), with new pricing and packaging, creating one access point for all analytics components. Oracle Data Visualization (ODV), the focus of this Magic Quadrant, offers integrated data preparation, data discovery (with advanced exploration) and interactive dashboards via a single design tool supporting both desktop and web-based authoring. ODV is available as a free download for personal use or as part of the departmental and enterprise OAC offerings. ODV is also available as an optional component of Oracle Business Intelligence 12c deployed on-premises.
During the past year, Oracle delivered a number of product enhancements. Importantly, it closed many basic feature gaps in ODV’s data preparation, interactive visualization and dashboarding, and improved the initial user experience and time to insight. In terms of innovation, Oracle introduced natural-language search via text, voice and chatbots, as well as NLG to explain insights in dashboards. On the mobile front, Oracle released Day by Day and Synopsis. Day by Day learns users’ behavior and presents them with the most important insights for their context, while Synopsis uses augmented reality to capture data for analysis.
In addition to rebranding, Oracle made major pricing changes for OAC, introducing a universal credits model with streamlined packages and two SKUs (departmental and enterprise) with the option to pay as you go and reallocate subscriptions monthly.
Oracle is positioned to the right in the Niche Players quadrant because it has filled many product gaps for a modern analytics and BI offering. Oracle has invested early in, and delivered, a number of augmented analytics capabilities, and has additional visionary elements on the product roadmap. Even though Oracle is expanding its market appeal and has delivered next-generation differentiators, it is following the innovators rather than leading. Also, its sales and marketing strategy has not yet translated these improvements into material market awareness beyond its Oracle customer base.
- Broad use-case support optimized for Oracle environments: Customer references report using OAC for a range of analytics use cases — from decentralized to governed and centralized deployments. It appeals to IT departments that have implemented Oracle’s traditional BI platform Oracle BI 12c. Lines of business that have deployed Oracle BI SaaS operational reporting on top of Oracle enterprise applications also find ODV attractive. This is particularly so given the growing set of domain-specific Oracle E-Business Suite content packs for ODV (separate from Oracle BI packaged apps) that are made available free. Users are also able to leverage Oracle’s Essbase Service, which is now natively supported as part of ODV’s data preparation capabilities and is included with both OAC departmental and enterprise packages.
- Flexible, global and hybrid cloud offerings: ODV can be deployed on-premises or in Oracle’s global cloud. It can directly query on-premises data from the cloud, and can migrate and extend on-premises data models and content to the cloud (and vice versa). OAC components are Oracle-cloud-centric; Oracle does not and will not support multicloud deployments. However, Oracle’s support for hybrid cloud deployments and data gives its on-premises BI customers, and any customer with on-premises data, a glide path to the Oracle cloud. Cloud offerings are a top reason reference customers give for selecting Oracle for analytics and BI. More than 90% of on-premises reference customers using Oracle BI 12c with the ODV option are either deploying or planning to deploy in the cloud.
- Interactive exploration, dashboards, mobile and augmented analytics: ODV offers an integrated design experience for interactive analysis, reports and dashboards. In addition to offering core visual exploration features for light analysis, ODV supports advanced exploration — including custom groups, and drag-and-drop advanced analytic functions such as forecasting, clustering, trending, outliers, and so on. Augmented analytics features that include NLQ, automated insights and natural-language narration, as well as the mobile application Day by Day, differentiate Oracle from most other vendors in the market. The availability of these features should improve the below-average customer reference scores for complexity of analysis as they are adopted.
- Product vision: Oracle is making significant investments in augmented analytics, with text and voice interfaces for NLP as well as chatbots, and has made early investments in virtual reality. It has implemented these capabilities across data preparation (to expose data distributions and correlations), insight generation (to identify significant segments, clusters, drivers, outliers and anomalies) and narrated findings with prescriptive actions, with more on the roadmap. An integrated, agile data catalog leveraging these capabilities is on the near-term roadmap.
- Sales experience: Oracle introduced new pricing and streamlined packaging in late 2017, intended to simplify subscription and deployment and to make them more flexible. Oracle has also put in place a land-and-expand sales approach aimed at line-of-business buyers. However, customer reference scores place Oracle in the bottom quartile of vendors for sales experience, and it is still too early to assess broader customer reaction to the new pricing and packaging.
- Customer experience and operations: Overall, Oracle’s reference customer scores place it in the bottom quartile for operations and below average for customer experience, but results vary widely by product. References for the ODV Cloud Service score the platform as above average across all operations measures (product quality, support and migration), and for business benefits, user enablement and availability of skills both in the market and from the vendor. References for OAC and for Oracle BI 12c with the ODV option, by contrast, placed those offerings in the bottom quartile, which brings Oracle’s averages on these metrics below the survey average. Moreover, a top-quartile percentage of Oracle BI 12c customers (10%) say that difficulty of platform implementation is a problem — more than twice the percentage of OAC customers. At the product level, there were fewer than 25 customer references for OAC, so these statements are directional.
- Oracle-centric appeal: Oracle’s modern BI capabilities are primarily deployed in organizations that also use its enterprise applications and data management technology. Oracle has the highest percentage (of any vendor in this Magic Quadrant) of its reference customers (at 43%, more than double the next-highest vendor) having standardized on Oracle enterprise applications. Likewise, 78% of its customers have standardized on Oracle as their primary enterprise data repository for analytics — almost triple the next-highest vendor. OAC is, to date, largely unproven outside Oracle’s customer base.
- Product gaps: ODV has closed many feature gaps with stand-alone competitive offerings. However, gaps remain in self-service data preparation (such as the ability to combine multiple tables in one data connection) and in publishing and sharing (scheduling, alerting and formatting without Oracle BI 12c). Moreover, unlike Oracle’s traditional BI deployments, average deployment sizes for ODV reference customers are below the survey average.
Pyramid Analytics offers an integrated suite for modern analytics and BI, with a broad range of analytics capabilities that includes ad hoc analysis, interactive visualization, analytic dashboards, mobile, collaboration, automated distribution and alerts. The solution is well-suited to governed data discovery through features such as BI content watermarking, reusability and sharing of datasets, metadata management and data lineage.
In 2017, Pyramid introduced its new, next-generation platform, Pyramid 2018, with a refreshed user experience, more agnostic data and platform integration, a new query and analytics engine, and enhanced analytics capabilities. The new platform is designed around six primary activities — model, formulate, discover, illustrate, present and publish — that represent an integrated workflow of modern analytics and BI with a focus on self-service and user empowerment. The system preserves strengths of previous versions, such as robust data lineage and metadata management, while clearly improving the user experience for information exploration. Pricing is now based on a subscription model, charged per user per month, with a minimum term of 12 months.
Pyramid Analytics is in the Niche Players quadrant, due to shortcomings in customer experience and continued low market responsiveness. The product offering strategy and a focus on platform rebuild, rather than an ongoing innovation development cycle, also affect its position.
- Broad range of use cases: Pyramid continues to emerge as a balanced platform fit for multiple purposes and leveraged in a broad range of use cases. Traditional IT-centric reporting (51%), governed data discovery (64%) and agile centralized provisioning (62%) are the main use cases reported by customers. Customers leverage a broad range of capabilities, placing Pyramid’s breadth of use in the top quartile of all the vendors in this Magic Quadrant. Moreover, use of the offering for extranet analytics is also popular, with 22% of reference customers claiming this use case.
- Integration with Microsoft’s environment: Pyramid Analytics has, as last year, a high percentage of deployments on top of Microsoft-based enterprise data warehouses (EDWs), at 70% of its customer references. This is even higher than Microsoft’s own result (60%). It is also one of the top platforms used with Microsoft’s ERP and CRM solutions, with percentages roughly equivalent to Microsoft’s in both cases. Pyramid continues to offer tight and extensive integration with the Microsoft environment. It should therefore be assessed when such integration is a top requirement, or in a multi-infrastructure environment where Microsoft plays a key role, as the platform matures its support for other data infrastructures.
- Integrated Mode 1 and Mode 2 capabilities: Pyramid supports agile workflows and governed, report-centric content within a single platform and interface. Specifically, the product’s self-service data preparation and its publish, share and collaborate capabilities receive some of the highest product scores, with Pyramid 2018 rated “excellent” to “outstanding.” Reports and dashboards support scheduling and distribution, alerting and discussions — capabilities that are often lacking in modern analytics and BI products.
- Easy to implement, administer, support and migrate: Pyramid’s system administrators have a positive opinion of the product. When asked about problems with the platform, no reference customers considered Pyramid difficult to implement (one of the best survey results), and 64% identified no problems at all (putting the vendor in the top quartile). Moreover, customer reference scores place the platform in the top quartile of this Magic Quadrant for ease of platform administration, and put Pyramid Analytics in the top quartile for overall support. The migration experience is also largely positive (and in the top quartile). Customers often choose the product for its ease of use for content authors, and its reference scores here are above average.
- Product vision and innovation remain low: Pyramid scores low on both innovation and product roadmap. The company’s 2017 roadmap was squarely focused on rebuilding the product to deliver an agnostic platform and a modern user experience capable of thriving beyond Microsoft’s BI customer base. While these are important initiatives, such investments are neither innovative within the market nor centered on future buying trends.
- Challenges in customer experience: Reference customer scores place Pyramid in the bottom quartile of vendors for customer experience (achievement of business benefits, user enablement and skilled resources), a recurrence of the previous year’s result. The limited availability of skilled resources in the marketplace is the bigger problem, and one not easily addressed given Pyramid’s historically Microsoft-centric focus. In addition, while Pyramid scores well on ease of administration, ease of use for content consumers and visual appeal are arguably more important buying criteria, and here Pyramid is placed in the bottom quartile.
- Future viability concerns: Reference customers continue to assess Pyramid’s viability in this marketplace as lower than that of its competitors, placing it in the bottom quartile, albeit with a modest improvement over last year. This perception may continue to be driven by Microsoft’s expanding presence in what had been Pyramid’s sweet spot. In addition, Pyramid showed some of the highest intentions to discontinue use, with 12% of its reference customers planning either to discontinue or to reduce their usage and maintenance. The renewed platform, with an improved user experience and more agnostic platform connectivity, may reverse this trend if Pyramid can reach beyond its current Microsoft-focused customer base.
- Lack of differentiation in a crowded market: In the modern analytics and BI space, vendors need to excel on key characteristics in order to differentiate and attract new customers. The ability to serve as the on-premises server component to deliver Microsoft Power BI content was a way for Pyramid Analytics to engage organizations with a unique offering. With the market changing, and Microsoft slowly becoming autonomous in the delivery of fully on-premises analytics and BI content, Pyramid must leverage its platform-agnostic features and further differentiate itself to stand out from other vendors and compete. It is not currently clear if the retooling process that began with Pyramid 2018 will offer any more than the standard requirements across a wide range of features and data source connectivity. Product evolution will need to continue at a fast pace, including the addition of augmented analytics functionality, as the market continues to change and once again shifts to new capabilities to differentiate and better support the analytics process.
Qlik offers governed data discovery and agile analytics and BI via its lead product, Qlik Sense. The Qlik Analytics Platform supports developers creating customized applications and addresses the embedded use case. QlikView continues to be enhanced and still makes up the larger portion of the company’s installed customer base, while Qlik Sense now accounts for more than 50% of license revenue.
The scalable in-memory engine allows customers to build robust, interactive, visual applications. Some customers opt to use the engine as a data mart, in lieu of traditional data warehousing. Qlik NPrinting is an optional server component that supports Mode 1 BI with report distribution and scheduling. In January 2017, Qlik acquired Idevio to bring geoanalytics capabilities as an optional add-on. Both NPrinting and GeoAnalytics were initially developed by partners.
Qlik’s position in the Leaders quadrant is driven by progress on its roadmap for augmented analytics, improvements in marketing strategy, and ease of use. Its market execution is poorer than that of the other Leaders, largely due to its relatively low momentum, slightly lower product success, and its operations scores.
- Scalable product for robust applications: Customers often use QlikView and Qlik Sense as a type of data mart, because the Qlik Associative Engine supports multiple data sources, complex data models and complex calculations. Although self-service analytics and BI represent a large portion of current buying, Qlik’s scripting engine also supports complex data transformations from multiple data sources, in support of the agile, centralized BI provisioning of interactive dashboards that many users consume. While the vendor has continued to invest in point-and-click capabilities for loading data, the scripting engine also means developers are not limited to menu-driven options, and it provides traceability for load and transform operations. The Qlik Associative Engine is shared between QlikView and Qlik Sense. Enterprise governance features remain a strength and differentiator.
- Differentiated marketing: The Qlik Associative Engine has been a product differentiator since its inception, but not one that Qlik has articulated well in the past — sometimes referring to it as “the power of gray.” Customers often only understood this unique capability after having deployed the product. During the past year, Qlik has done more to provide clear business examples; improving data literacy as part of an overall analytics and BI program has been a key message during this time. This message goes beyond product capabilities, with the vendor sponsoring virtual events, meet-ups and blogs, and providing education services on the topic. Also, with the rise of interest in “data for good” efforts, it is noteworthy that Fast Company magazine recognized Qlik as one of the Top 10 innovative companies for Social Good. Qlik’s Change Our World Corporate Social Responsibility (CSR) program, launched in 2010, now provides free software and services to more than 300 not-for-profit organizations.
- Product vision: Qlik was early to market with some elements of its product vision. Qlik’s marketplace allows partners to develop content that further extends the platform, or to monetize prebuilt industry vertical applications. Success in the marketplace has been the source of some of Qlik’s acquisitions — such as NPrinting and Idevio. Qlik DataMarket also provides curated public datasets in ready-to-consume form. The vendor has further executed on its augmented analytics roadmap, the ability to analyze both data at rest and streaming data, multicloud support, big data, and its crowdsourced recommendation engine.
- Partner network: With a network of more than 500 system integrators and 1,700 partners around the world, an estimated 70% of Qlik implementations are partner-led. These partners often have long-term relationships with their customers and understand their particular requirements. Qlik partners also contribute product extensions, prebuilt content and training, via either the marketplace or the community. Qlik’s reference customers scored the availability of skilled resources in the market as above average. While Qlik’s partner network is an asset overall, it has been slow to adopt Qlik Sense over QlikView, with which partners have more experience.
- Self-service: One of QlikView’s key attributes was its rapid implementation approach for exploratory dashboard applications, rather than self-service data preparation and analysis. Qlik Sense, meanwhile, was about modernizing the interface and making it more open and extensible; but it also was intended to bring more self-service analytics. To date, this has only partly been addressed. The ease with which a user can complete a full analytic workflow (access a new data source, combine multiple data sources, then visually explore the data) using Qlik Sense is weaker than for its chief competitors. Complex joins require scripting, and once data is loaded into Qlik, the explore and dashboard design process is still largely a distinct two-step process. Qlik Sense continues to lack a point-and-click formula editor or the ability to create calculations while exploring and visualizing.
- Cost of software: Existing QlikView customers do not get the Qlik Sense product as part of maintenance, unless they migrate. Customers rarely migrate, because QlikView continues to be enhanced. Also, the pricing of the two products is drastically different. Qlik Sense has primarily a user-based model (although one token can be shared by multiple users who do not log on frequently), while QlikView has more flexible pricing options but is often server-based. Qlik is trialing new upsell options that cover both products. Of Qlik’s reference customers, 27% cited cost of software as a limitation to wider deployment (it has been in the top quartile for this limitation for multiple years). In a market with downward pricing pressure, cost will continue to be a challenge unless Qlik can articulate its value-added differentiators more clearly. In common with several other vendors in this Magic Quadrant, Qlik is transitioning to a subscription-based pricing model.
- Slowed mind share/momentum: Within this market, Qlik once ranked first or second on Gartner client interest metrics, based on inquiries and searches on gartner.com; for 2017, it is a distant third. Because Qlik is now privately held, it does not report total revenue growth, but its global head count was reduced by 2.8% year over year through 3Q17. Qlik made an additional 10% workforce reduction in January 2018, primarily in sales, as it shifts its emphasis to enterprise sales, with partners leading sales to small or midsize organizations. Changes in executive leadership, particularly of the CEO, CTO and VP of Sales, are points of concern.
- Migration challenges: Qlik’s reference customer scores for overall support are above average, with a slight improvement year over year. However, its scores for satisfaction with the migration experience place it in the bottom quartile, a decline year over year. Migration can include an upgrade within product lines as well as a move from QlikView to Qlik Sense. Because Qlik’s primary market strategy has been for Qlik Sense and QlikView to coexist, migration tools were largely absent until the June 2017 release. While the load scripts and data models are common to the two products, the dashboards essentially require a redesign. To an extent, this is appropriate given how different the products are, but it remains a practical challenge. Eighteen percent of Qlik’s customers plan to discontinue or reduce their use of QlikView, and Qlik Sense will not necessarily be the replacement. The lack of simple migration tools means that customers may well switch vendors instead.
In November 2017, Salesforce rebranded its analytics portfolio from Einstein Analytics to Salesforce Analytics. It includes basic operational reports and dashboards for Salesforce data, the Einstein Analytics Platform (formerly Wave), Einstein Analytics-based packaged applications and Einstein Discovery. The Einstein Analytics Platform is for creating point-and-click interactive visualizations, dashboards and analysis with integrated self-service data preparation for Salesforce and non-Salesforce data. Einstein Discovery is an augmented analytics platform based on Salesforce’s acquisition of BeyondCore in 2016. It leverages machine learning to generate smart suggestions for how to prepare data, and then automatically finds, visualizes and narrates important insights in a story for each user, without requiring them to build models or write algorithms.
During the past year, Salesforce continued to improve the integration, workflow and embedding of Einstein Discovery with the Einstein Analytics Platform and within its packaged applications. It has also introduced the beginnings of an NLQ capability within a dashboard, with analytic functions such as “rank.” In November 2017, Salesforce previewed a new product in pilot, Einstein Data Insights, which delivers automated insights based on Einstein Discovery for Salesforce basic operational reports.
The combination of Einstein Discovery’s disruptive augmented analytics capabilities, together with Salesforce’s vision for an integrated portfolio, global presence, partner network, strong positioning, and marketing and sales execution potential, places Salesforce in the Visionaries quadrant. Salesforce initially caters and appeals to its installed base, but continuing to execute on its next-generation, machine-learning/artificial intelligence (AI)-enabled roadmap could make it a more significant player in the market overall during the coming year and beyond.
- Leading in augmented analytics: Salesforce has been leading the market in augmented analytics. The acquisition and integration of Einstein Discovery (augmented analytics) with the Einstein Analytics platform (formerly Wave) — embedded in Einstein Analytics-based analytic applications and as an AI augmentation to basic Salesforce reports and dashboards — have the potential to make prescriptive, AI and machine-learning-driven insights pervasively available to users across the enterprise. In addition, Einstein Data Insights (currently in “pilot”) represents a compelling vision. It has the potential to extend the reach of augmented analytics to operational frontline workers. To facilitate trust in autogenerated models and broader adoption, Einstein Discovery exposes the key drivers of insights to users and makes the underlying model and R code available for data scientists to further validate, export and extend.
- Optimized for Salesforce: Reference customers evaluate Salesforce Analytics as visually appealing and easy to use for Salesforce business consumers. Although it can support non-Salesforce data, Einstein Analytics is natively integrated with Salesforce security, collaboration and metadata, including simplified access to Salesforce application tables through an intuitive wizard. Einstein Analytics also supports extensive usage monitoring across Salesforce products as well as autogeocoding and custom maps for Salesforce data. Users can invoke Salesforce actions from within Einstein Analytics (such as data quality, new campaigns and targeted outreach) and can collaborate using Chatter. Einstein Discovery can be used to access, model and generate insights, visualizations and narratives from Salesforce and non-Salesforce data that can be further modified and embedded in Einstein Analytics dashboards and packaged applications. While the workflow of the integration across the Salesforce Analytics portfolio is a work in progress, it continues to improve at a rapid pace with each new release.
- Partner ecosystem and marketplace: Salesforce Analytics’ capabilities for embedding analytic content are a platform strength. These are leveraged by a robust partner ecosystem that includes ETL, data science and machine learning vendors, independent software vendors and system integrators. Its developer marketplace, AppExchange, provides a platform for independent software vendors/developers to build and sell custom content (such as datasets, lenses and applications). An above-average percentage (29%) of Salesforce Analytics’ reference customers report using it for an OEM or embedded use case.
- Market understanding: Performing more advanced types of analysis on more complex data is a top requirement in this market. As evidence of Salesforce’s alignment with this trend, a top-quartile percentage of its reference customers report using the platform for more complex types of analysis, driven in part by the expanded adoption of Einstein Discovery. Ease of use for content consumers (also evaluated by reference customers as above average) and data access and integration are cited as the top two reasons why reference customers select Salesforce Analytics.
- Einstein branding: Salesforce has acquired 12 AI-centric companies since 2014, which form the foundation of its new AI-enabled, customer-facing services and applications, all branded Einstein. However, the various Einstein-branded products, based on different acquired platforms, are marketed, sold and packaged as separate optional components. This is confusing and potentially costly for buyers. For example, predictive sales forecasting is delivered via Einstein Forecasting, an embedded application, and via myEinstein, both sold as optional embedded capabilities within Salesforce applications. (myEinstein is a business-user-oriented, point-and-click development environment for building embeddable machine-learning-driven models/applications without writing code or having extensive data science knowledge.) These capabilities are, however, separate from the Einstein Discovery platform and are not part of the Salesforce Analytics portfolio.
- Packaging encourages use with Salesforce data only: Although there is no product limitation on combining Salesforce data with non-Salesforce cloud and on-premises data, Salesforce Analytics has to date largely been deployed by organizations analyzing Salesforce data. Beyond the advantages derived from the Salesforce application-specific optimizations, this is partly due to Salesforce Analytics’ pricing and packaging model, which motivates buyers to purchase the lower-cost, individual, Salesforce-data-only analytic applications (already considered expensive by Salesforce customers). When integration of non-Salesforce data is required, buyers more frequently consider competing alternatives than the more expensive Einstein Analytics Platform, which includes all the analytic applications together with support for non-Salesforce data.
- Cost: Salesforce reference customers continue to cite the cost of its software as the main barrier to their broader deployment of Salesforce Analytics overall. Forty-nine percent of Salesforce reference customers cite cost as a barrier, a higher percentage than for any other vendor in this Magic Quadrant. This has negatively affected reference customers’ perception of their sales experience, which is also ranked as below average.
- Functional gaps: Data source connectivity, including newly introduced hybrid connectivity to on-premises data, is inconsistent across Einstein Analytics and Einstein Discovery. Native connectivity to enterprise applications other than Salesforce and Microsoft Dynamics CRM continues to require partners’ ETL tools. At the same time, some basic features — such as the full range of joins and the ability to create hierarchies within data preparation — are on the roadmap. Advanced data manipulation, such as creating custom groups, must be done in the data preparation interface rather than within a dashboard during analysis and exploration. Moreover, while ease of use for consumers is placed in the top quartile according to Salesforce reference customer scores, ease of administration, development and deployment are placed in the bottom quartile. Also, ease of use for content development has the lowest placement of all the vendors evaluated.
SAP delivers a broad range of analytics and BI capabilities for both large IT-managed enterprise reporting deployments and business-user-driven modern deployments. To support this, it offers two distinct platforms: SAP BusinessObjects Enterprise for on-premises deployments; and SAP Analytics Cloud (formerly SAP BusinessObjects Cloud), a purely cloud-based deployment built on the SAP Cloud Platform. The vast majority of companies that choose SAP for their analytics and BI needs do so because they have standardized on SAP enterprise applications.
The SAP offerings evaluated in this Magic Quadrant are SAP Lumira (a module within BusinessObjects Enterprise) and the SAP Analytics Cloud solution. SAP Analytics Cloud is now the strategic product for modern analytics and BI; SAP Lumira will continue to be supported through 2024, with enhancements planned in 2018 but minimal investment thereafter.
SAP is placed in the Visionaries quadrant, where its position is affected by product limitations as well as the scores from its reference customers in the highly weighted customer experience categories. SAP still does not have the broad market momentum of the Leaders in this space. Its product vision has been strengthened — through the broader Leonardo digital innovation system approach, and specifically within its analytics and BI roadmap centered on the Analytics Hub. However, its sales strategy, marketing strategy and market understanding affect its relative position against other vendors.
- Closed loop functionality and innovation: SAP Analytics Cloud’s integrated functionality for planning, analytical and predictive capabilities in a unified, single platform is differentiated. It is one of only two vendors in the Magic Quadrant with an offering scoped in this way. SAP has continued to strengthen Analytics Cloud in 2017, adding automatic narration to visualizations, automatic clustering to scatter and bubble charts, the ability to use R visualizations in data stories, and a native iOS app. The associated Digital Boardroom is a differentiator and is particularly attractive to executives because it includes “what if” analysis and simulations. With Digital Boardroom, SAP can leverage its strategic position with large enterprises and also protect its installed base against smaller vendors with less access to (and visibility with) senior executives. SAP’s approach to innovation has included co-innovation with customers and improving its workforce diversity; 25% of leadership positions are now held by women, which is above the industry average. SAP’s diversity goal further includes culture and identity, cross-generational intelligence, and differently abled people.
- Prepackaged analytic content: Drawing on the business content approach it pioneered with SAP Business Warehouse (BW), SAP offers a growing library of prebuilt content available online for SAP Analytics Cloud customers. The free-to-download content covers 11 verticals and a wide range of line-of-business functions. It includes data models, data stories (including visualizations), template SAP Digital Boardroom agendas, and advice and recommendations on using data sources such as S/4HANA, SAP BW, Marketing (SAP Hybris Marketing or SAP Hybris Marketing Cloud), SAP SuccessFactors or SAP Hybris Cloud for Customer. These sources are further broadened by the growing ecosystem of partner-originated analytic apps and extensions on the SAP App Center online marketplace.
- Broad spectrum offering: The dual nature of SAP’s portfolio for modern BI means it can address the needs of different types of buyers. Lumira 2.0, with its improved discovery-to-design UX and simplified workflow, is attractive to organizations looking to rapidly prototype and deploy dashboards, most often in order to draw more value from an existing SAP BI infrastructure — increasingly SAP HANA. As such, the top reasons customer references give for selecting Lumira are data access and integration, and integration with enterprise apps. SAP Analytics Cloud is selected for quite different reasons, according to its reference customers — primarily ease of use for consumers and cloud deployment — clearly reflecting the nature of its value proposition.
- Cloud-centered, hybrid vision: SAP is aware that managing across its on-premises and private and public cloud deployments is a challenge for organizations. Its Analytics Hub is intended to deal with this complexity by creating a bridge between deployment types and providing a single front end for SAP and partner-originated analytic offerings. SAP intends to offer a hybrid analytics and BI pricing model from 2018. It is not yet clear what the pricing level will be for this hybrid model, which is currently in pilot. However, the fact that SAP is developing one reinforces the commitment of its product roadmap to reflecting organizations’ use of a mixture of deployment styles as this market matures.
- Functional limitations: Almost a quarter of reference customers using Lumira cite absent or missing functionality as a problem — the highest incidence of any product covered in this Magic Quadrant. SAP Analytics Cloud has “fair” or “poor-to-fair” functionality in scalability and model complexity, data source connectivity and ingestion, augmented data discovery and capability to embed analytic content. Lumira has fair or weaker functionality in advanced analytics for citizen data scientists, metadata management and the publish, share and collaborate capabilities.
- Product quality and performance: SAP’s reference customer scores for product quality are the lowest overall, and more SAP customers cite software quality as a barrier to wider adoption than do customers of any other vendor. Performance is also an issue: 20% of Lumira reference customers and 15% of those using SAP Analytics Cloud say that poor performance is a problem they have experienced. Performance is the primary limitation to wider deployment of SAP’s modern analytics and BI platforms, according to its reference customers (more so than for any other vendor evaluated). The combination of poor quality and performance can inhibit user adoption, and will not help internal advocates of SAP’s analytics and BI software when they compete with demands for non-SAP platforms. It may also partially explain why 9% of SAP’s reference customers using Lumira are planning or considering reducing or discontinuing its use.
- Interoperability: SAP Analytics Cloud and Lumira have clear value propositions and, despite the product overlap, SAP customers may choose to use both products. However, there is very limited interoperability between the two. SAP’s roadmap for the next two years includes improved interoperability, with current acceleration focused on closing basic gaps in SAP Analytics Cloud. This development effort forms part of SAP’s ongoing initiative to simplify its broad set of BI technologies, also evident in Lumira 2.1 becoming a replacement for BEx Web and in its integration with Enterprise BI via the new Fiori-style SAP BusinessObjects BI Launchpad.
- Customer support: Support quality remains a relative weakness for SAP in this Magic Quadrant. Reference customer scores for expertise, response time and the time it takes to resolve issues put SAP in the bottom quartile of the vendors evaluated for this capability.
SAS competes in the analytics and BI market as well as the data science and machine learning market, the latter segment being where the company’s origins lie. SAS Visual Analytics is the primary product offering and brings interactive discovery and dashboards as well as reporting and scheduling for mainstream business users. The advanced analytics capabilities within SAS Visual Analytics are geared toward citizen data scientists, with no need for coding or specialist statistical skills.
SAS Visual Analytics is available either in an on-premises deployment, or through the cloud in SAS’s own data centers or through third parties such as AWS. In March 2017, SAS released a major new version of the product running on a new architecture: SAS Visual Analytics on SAS Viya. SAS Viya is based on an in-memory, microservices architecture that allows for greater flexibility in navigating on-premises and cloud data stores and deployment models. In addition to running on a new platform, SAS Visual Analytics on SAS Viya includes a refreshed user interface based on HTML (earlier versions relied on Adobe Flash) and a streamlined workflow.
SAS Visual Statistics (outside the scope of this Magic Quadrant) is an add-on to SAS Visual Analytics that provides a GUI for citizen data scientists and data scientists to build and refine predictive models while exploring data within Visual Analytics.
SAS’s position on the Magic Quadrant as a Visionary is driven by its product roadmap, which also includes augmented analytics for unstructured data sources, robust vertical industry solutions built on top of SAS Visual Analytics and a strong global presence. Its position was hampered by low reference customer scores for operations, sales experience and customer experience.
- Comprehensive product for multiple use cases: SAS Visual Analytics’ scores resulted in an evaluation of “excellent to outstanding” for two of the critical capabilities that most drive today’s buying requirements: interactive visual exploration and analytic dashboards. Users can visually explore data while also performing analysis and data manipulation such as time period and variance calculations, binnings and groupings. The product supports advanced chart types, including correlation matrix and decision trees, out of the box. A range of advanced analytics, including multiple forecasting methods, clustering and decision trees, are native and menu-driven. In addition, R, Python, Java and Lua models are all now supported, as well as SAS models. The ability to ingest semistructured textual datasets and apply sentiment analysis is a competitive differentiator. Because SAS also enables the creation of formatted reports with scheduling, distribution and alerting to a range of output formats, it can be used for both Mode 1 and Mode 2 styles of analytics and BI.
- Stable and socially conscious: SAS is one of the largest privately held software vendors, with $3.24 billion in revenue in 2017. Its total company revenue grew a modest 1.25% in 2017. SAS does not specifically break out revenue growth for Visual Analytics, but cites more than 17,000 customers. SAS has some of the best diversity numbers in the industry. Beyond being worker-friendly, SAS has long had a social responsibility and education outreach program. SAS recently expanded its efforts to include the Data for Good movement, with a new mobile app, GatherIQ, which helps link projects and not-for-profit organizations with community experts.
- Prebuilt industry solutions: A large portion of SAS company revenue comes from its prebuilt industry or horizontal solutions, particularly for fraud, risk management, Internet of Things (IoT), and customer intelligence. These solutions are built on top of SAS Visual Analytics.
- Product vision: SAS’s product roadmap includes a number of key elements around augmented analytics that includes natural language query and generation using both text and voice. The product will include smart data preparation capabilities. Insights will be generated based on statistical relevance as well as usage and ratings by other users. The near-term roadmap also supports running the analysis where the data lives, for edge analytics.
- Few customers on SAS Viya: Improved ease of use and visual appeal were among the design tenets of SAS Visual Analytics on SAS Viya. Based on Gartner’s assessment of the latest version, ease of use and visual appeal have indeed improved. However, as few customers are currently using this version of the product, SAS is still placed in the bottom quartile of vendors in this Magic Quadrant for ease of use. Most reference customers are still using SAS Office Analytics and SAS Visual Analytics 7.4, or earlier. This is an important factor in SAS’s score for market understanding, because ease of use is a strong factor in driving current buying requirements.
- Cost of software: This has been a perennial limitation to wider deployment for SAS, with 50% of reference customers citing cost as a problem this year. SAS updated its pricing and packaging a couple of years ago, but has not kept pace with the downward pricing pressure in the current market. Also, the overall perception that SAS is high-priced — based on the cost of its prebuilt solutions, which command a premium — casts a shadow on SAS Visual Analytics. SAS has only ever offered subscription-based pricing, which is now pervasive in the market, but its subscriptions often carry a higher cost beyond year three. This cost challenge can be further exacerbated by, or contribute to, SAS’s below-average ranking for sales experience.
- Migration challenges: SAS is evaluated as lowest of all Magic Quadrant vendors for migration experience, with 9% of its reference customers rating the experience as “poor to limited.” This has been a challenge for multiple versions over many years, so moving to a completely new platform with SAS Viya is likely to be even more challenging. The initial release of Visual Analytics 8.1 on SAS Viya was not really an upgrade, but a reimplementation. Initially, there were no migration tools for SAS Viya and it was primarily offered to new customers or new deployments. SAS expects that 90% of its existing SAS Visual Analytics content will migrate to the SAS Visual Analytics 8.2 on SAS Viya version, but this remains to be seen.
- Evolving cloud: SAS Viya is a cloud-first platform enabled by a microservices architecture. However, administration capabilities for hybrid cloud/on-premises deployments and for elastic scaling of workloads (from both a technical and licensing perspective) are a work in progress. In addition, SAS Visual Analytics currently lacks hybrid connectivity to on-premises data sources. Cloud is primarily run in SAS’s own data centers, although AWS is used in some world regions. Only a small percentage of reference customers are running SAS in a public cloud.
Sisense offers an integrated analytics and BI platform covering the entire workflow — from data collection and storage on a self-contained in-memory columnar database with in-chip processing, through visual data exploration and dashboards, to recently added embedded advanced analytics features. The company is privately held. OEM and embedded analytics continues to be a strong use case for Sisense and accounts for more than half of the company’s revenue.
Sisense 7 was released in October 2017, with significant updates to the platform. An improved single stack, with an entirely browser-based interface and a more integrated workflow, now supports enhanced data ingestion and mashup capabilities, an easier-to-use centralized ElastiCube manager, improved embedding features and integrated machine-learning capabilities. The product also gained better cloud readiness and can leverage cloud services from Amazon, Microsoft and others, with configurations spanning from bring-your-own-license to private cloud. Enhanced augmented analytics capabilities for insight detection and explanation are also on the product roadmap.
Sisense remains in the Visionaries quadrant. The company scored well on most aspects of Ability to Execute, with improved scores for product, market responsiveness and operations, in particular.
- Relevant product enhancements and a strong vision: Sisense 7 adds relevant enhancements to the product and closes some gaps. Better cloud and mobile support, an improved self-service data preparation and ElastiCube manager, and an enhanced integrated workflow in a browser-based solution are some of the highlights. In addition to these product improvements, reference customer scores place Sisense in the top quartile for ease of use for content development, ease of use for content consumption and visual appeal; arguably the most important aspects of overall ease of use, a key selection factor. From an IT perspective, customer references also have a good outlook on the product, with top scores for ease of use for administration and for migration experience. The vendor is also positioned in the top quartile for its vision on product strategy, which is supported by a solid development roadmap with innovative capabilities. As workforce diversity helps with innovation, it is noteworthy that more than 30% of leadership positions at Sisense are held by women, above the industry average.
- Strong OEM and embedded use case: Nearly 53% of Sisense’s reference customers leverage the product for the OEM and embedded use case — the highest proportion of all vendors evaluated here. This position results from a strong and dedicated partner network that leverages the product to deliver customer-facing analytics and data monetization solutions, for example. The product also supports white labeling and REST APIs to allow for embedding and extending its out-of-the-box capabilities. The vendor also works with OEMs on pricing and packaging appropriate to this use case; a go-to-market differentiator in the crowded modern market. In 2017, Sisense launched its Sisense Startup accelerator program, which makes the software free for a year for startups wanting to develop solutions with the product embedded. By the end of 2017, more than 700 organizations had registered for this program.
- Strong market momentum backed by solid customer experience and operations: Customer reference scores for Sisense position it in the top quartile of these vendors for customer experience and operations, including support, product quality and a top migration experience. This helps the company sustain solid market momentum: its total company revenue grew by an estimated 70% in 2017, with 1,000 customer deployments. Reference customer scores also placed Sisense in the top quartile for ethics, culture and diversity, with 100% of its references evaluating the vendor as “excellent.”
- Enterprise standard in midsize organizations: Sisense is considered the enterprise standard by 77% of its customer references, the top result for vendors in this Magic Quadrant. However, a large portion (62%) of its reference customers are SMBs, more than for most vendors here. Rather than adopting it as a point solution for narrow-scope use cases sold on a departmental basis, customers are adopting Sisense as their primary and, in many situations, exclusive analytics tool, used for a broad range of analytics. The range of capabilities offered through a unified web-based workflow helps support this level of customer penetration.
- Deployment sizes: Although often the enterprise standard, the average size of Sisense’s deployments is just over 300 users. This places it in the bottom quartile of Magic Quadrant vendors and is most likely derived from its adoption in smaller organizations. Sisense has relevant customer references from large deployments, but organizations planning to expand beyond a few hundred users should speak to customer references that correspond to their scalability requirements.
- Small average data volumes: In assessments of the volume of relational data analyzed, Sisense reference customers report among the smallest data volumes in this Magic Quadrant. Moreover, once data is loaded into Sisense’s in-chip engine, the majority of reference customers analyze less than 50GB within a single application. These results do not align with the company’s go-to-market differentiation of a product capable of handling very large, complex datasets and delivering high performance. Most deployments appear to be in SMBs, which are inherently less demanding and typically do not need to analyze large data volumes. Although poor performance is not identified as a significant problem by reference customers, organizations requiring analysis of significantly larger data volumes should invest in scalability tests before committing to the product.
- Product limitations: Sisense 7.0 (October 2017) addressed several limitations identified by customers, but product limitations remain. Interactive visualization capabilities — a key buying and adoption factor — were scored as only “good,” which is lower than many of Sisense’s direct competitors. In particular, there are gaps in the ability to visually define groups and to perform complex filtering via point-and-click as opposed to formulas. Also, augmented data discovery — a key capability for next-generation platforms — is only modestly represented in the platform, in areas such as data preparation; it is part of the current development roadmap.
- Limited geographic coverage: Sisense’s geographic coverage is limited, with support centers and operations in the U.S., Israel, and Kiev, Ukraine. Sisense then leverages its partner network (of more than 400), including resellers and OEM integrators, for other regions. This lack of direct regional presence can be a challenge for some potential customers.
Tableau offers an intuitive interactive visual-based exploration experience that allows business users and any content author to access, prepare, analyze and present findings in their data without technical skills or coding. Tableau offers three primary products: Tableau Desktop, Tableau Server and Tableau Online (its cloud offering). Tableau has always been committed to giving everyone across the enterprise the power to explore and find insights in data. It has disrupted the market with this approach, and through its land-and-expand sales model — which has become an expected feature of all modern analytics and BI platforms.
During the past year, Tableau has delivered on a number of promised enterprise features that appeal to an IT buyer — as part of a shift in strategy toward large enterprise deployments and sales. It released certified and recommended data sources for improved governance of large deployments; hybrid data support from the cloud; scheduling and alerting; an enhanced SDK and APIs; and added collaboration, to name a few. Tableau also progressed in better handling data at scale. Tableau’s new Hyper in-memory engine was released in January 2018, and its Project Maestro data preparation product is now in beta. In August 2017, Tableau acquired ClearGraph, to provide a natural-language interface, and is working on the integration for release in 2018.
Tableau is in the Leaders quadrant. Contributions to this position include its efforts to build product awareness globally, and its product roadmap, which includes NLP, augmented data preparation and discovery, and agile data cataloging, among other capabilities. Strong market momentum in an increasingly competitive and price-sensitive market, ongoing product improvements, and excellent customer reference scores for customer experience and success also drive its position.
- “Gold standard” for interactive visual exploration: Tableau’s core product strengths continue to be its intuitive interactive visualization and exploration, and analytic dashboarding capabilities, for almost any data source — leveraging its extensive set of data connectors, with both in-memory and direct query access for larger datasets. This combination, which includes drag-and-drop advanced functions such as forecasting, clustering, automated geocoding and assisted formula editing, allows users to do deep exploration and manipulation of their data more easily and quickly than with most competing platforms. Tableau’s reference customers continue to purchase the product for its user experience, ease of use and functionality, for which its scores are among the highest of all the vendors evaluated here.
- Focus on customer experience and success: Customers continue to be happy with Tableau. Reference customer scores place all aspects of customer experience and operations in the top quartile of vendors in this Magic Quadrant. This includes a top quartile score for achievement of business benefits and the highest score for user enablement — both key success measures. Tableau skills are in high demand, and to support this, Tableau offers a vast array of learning options, together with Tableau Public, its online community and its extensive network of Alliance Partners. Its well-attended annual user conference, which topped 14,000 attendees in 2017, is further evidence of user satisfaction and customer success. Tableau’s reference customer scores place it in the top quartile for ethics and culture, with 94% of reference customers scoring it as “excellent.”
- Expanding deployments and standardization rates: An increasing number of Tableau reference customers (at more than 55%, an above-average percentage) are using it to empower centralized teams to provision content for consumers in an agile and iterative manner. Other reference customers (64%) are using it to enable completely decentralized analysis by business users. Average deployment size continues to grow each year as organizations standardize on Tableau at higher rates and deploy it more broadly across the enterprise. Most of Tableau’s reference customers consider the platform to be either “one of” (36%) or “the” (52%) enterprise standard. Moreover, its reference customers report top quartile deployment sizes in terms of number of users compared with the other vendors evaluated.
- Flexible deployment options: Tableau can be deployed in the cloud, with Tableau Online, or on-premises. Tableau was early to the cloud, initially relying on deployment in its own data centers. Its cloud deployment options have evolved to also provide prepackaged virtual machines for AWS and Microsoft Azure, in order to simplify deployment. During the past year, it has added support for the Google Cloud platform as well as hybrid data support to on-premises sources from the cloud. Tableau Server is available as “bring your own license” on the Azure and AWS marketplaces; it is also available on a pay-by-the-hour basis on AWS Marketplace.
- Market mainstreaming: Visual-based data exploration — Tableau’s primary disruptive capability — remains a differentiator, but is now offered in some form by most players in this market. Downward pricing pressure from low-cost license options is affecting the competitive environment. This has caused increasingly competitive and contested expansion and enterprise deals as feature differentiation narrows, competitive options grow, and enterprise features and price versus value factor more in the purchasing decision than before. Although Tableau continues to attract new customers and expand deployment size, this intense competitive environment has contributed to Tableau’s slower revenue growth in recent years.
- Pricing and packaging: Software license cost, particularly as low-cost options grow and improve, continues to be a challenge for Tableau. One of Tableau’s few weak spots in its reference customer scores centers around license cost — with among the highest percentage of reference users citing cost as a limitation to broader deployment. Lower-priced market entrants are increasingly appealing to less analytically mature buyers, particularly for large enterprise deployments with a high percentage of consumer users. At the same time, many traditional BI incumbents are making their modern analytics and BI components available as part of existing maintenance contracts. Tableau has responded to this competitive pressure by moving to subscription pricing, offering enterprise pricing options and being more flexible in terms of discounting on large deals.
- Lack of complex data model support: Organizations are seeking insights from larger and more varied combinations of data, requiring more complex data models. While Tableau supports a broad range of data source connectivity options, complex data models such as multifact table models must be created either outside Tableau in a data warehouse or via self-service data preparation partners. Both of these options will add to the TCO. Moreover, poor performance for large in-memory extracts often requires modeling in a separate data repository that is directly queried from Tableau. The new Hyper in-memory database is expected to materially improve performance on large in-memory data extracts. Tableau’s stand-alone self-service data preparation tool (code-named Project Maestro) is currently in beta, and is intended to make it easier for users to shape and harmonize large and complex data without resorting to another tool.
- Product vision: Tableau is investing in, but not driving, the next wave of disruptive innovation. The vast majority of its product roadmap investments are focused on closing the gaps in enterprise features and extensibility, flexible deployment including cloud, supporting larger and more complex governed datasets, and on making its visual exploration paradigm easier. Tableau has increased its investment in NLQ, and augmented analytics is on its roadmap. It is still, however, later to invest than the innovative startups and larger vendors in what will be a cornerstone of the future of the modern analytics and BI market.
ThoughtSpot differentiates itself with its search-based interface for visual exploration at scale; several of the company’s founders come from Google. First appearing in the 2017 Magic Quadrant, the vendor has continued to execute on its roadmap, and the newly introduced SpotIQ brings augmented analytics to its product line.
The product is most often deployed as an on-premises appliance on commodity hardware, with data loaded in-memory and indexed for fast query performance. ThoughtSpot recently introduced free SaaS for up to five users and 10GB of data as part of its freemium model and for faster proofs of concept, but this option is not advertised on the company website. ThoughtSpot is often deployed in addition to other analytics and BI products; the aim being to use the familiarity of search to bring new classes of users to analytics and BI.
ThoughtSpot has moved from the Niche Players to the Visionaries quadrant, driven by an expanded roadmap and differentiated marketing, along with improvements in operations and viability.
- Complex search and performance at scale: With ThoughtSpot’s search-based visual data discovery, users can enter terms such as “sales by product for 2017” to generate a visualization. The search goes beyond a simple keyword to include analytical keywords such as “top” for rankings or “near” for geoanalytics, with all searches incorporating past searches to generate the most relevant results. Customers repeatedly describe this new interface and its ease as a way of bringing analytics and BI to new users and use cases. Reference customers for ThoughtSpot gave it high scores for overall ease of use, with 27% selecting the product primarily for its ease of use for content consumers (more than for any other vendor in this Magic Quadrant). The data models also support complex relationships such as multiple fact tables, chasm traps and fan traps, as of 2017. Fast performance at scale is supported by indexing all data, compiled queries, an in-memory massively parallel processing (MPP) columnar database, and a distributed cluster manager. One reference customer cited analyzing 20TB of data, with 27% of reference customers analyzing more than 1TB of in-memory data.
- Improved market momentum: ThoughtSpot’s customer base has grown threefold year over year. As of December 2017, the company has 230 employees, representing roughly 70% head-count growth. The vendor received an additional $60 million of venture capital funding in 1Q17, bringing the total to $160 million. It has generated interest in a crowded market by hosting two marketing events in 2017, dubbed “analytics thought leadership” and involving notable speakers and key customers. The New York-based event sold out, with more than 300 attendees. Reference customers’ scores for ThoughtSpot’s overall diversity, ethics and culture were the highest of any vendor in this Magic Quadrant. The introduction of a free trial, launched in 2Q17, is also an important move for the vendor in engaging with new prospects.
- Innovative roadmap with augmented analytics: Augmented analytics represents the next wave of disruption in this market, and while ThoughtSpot version 4.0 had some elements of it, SpotIQ further delivers on the use of machine learning to automatically generate relevant insights. In addition, the vendor’s vision includes leveraging crowdsourced content and ratings to surface the most relevant insights. The current search-based analytics will also expand to include voice-driven NLP.
- Support and sales: ThoughtSpot’s reference customers gave it high scores for operations, placing it in the top quartile of the evaluated vendors. Operations includes support, time to resolve issues, and migration experience. ThoughtSpot also scored highly for sales experience, as it did last year.
- Data replication and less robust features: ThoughtSpot requires all data to be loaded into its in-memory engine. For customers with slow and disparate data sources, the in-memory appliance provides a benefit; however, for customers that have already invested in high-performing analytic databases (such as Amazon Redshift or SAP HANA) it undermines their investment. Interactive visual exploration with data manipulation is a key buying requirement and an area in which the product currently lags. Similarly, the degree of geographic analytics, offline mobile capabilities and extensive formatting options are all product gaps for ThoughtSpot. Row-level security could be better implemented and is mainly via a formula. Fifteen percent of customer references for ThoughtSpot cited absent or weak functionality as a barrier to deployment (placing it in the top quartile of vendors for this complaint).
- Skilled resources and user enablement: As a relatively small vendor in this space, skilled resources for ThoughtSpot are less available and customers must rely on the vendor for such expertise. While customer references give high scores for the availability and expertise of vendor resources, skills in the marketplace are rated in the bottom quartile. The company has established periodic customer advisory boards, but otherwise lacks a user conference at which to share best practices. Documentation does not keep pace with the product releases, which is an issue for user enablement and when customers have less access to broad support in the marketplace.
- Geographic reach and ecosystem: ThoughtSpot lacks the geographic presence of its larger competitors. Support centers exist in the U.S. and, newly, in the U.K. and India. There are no vertical industry templates. ThoughtSpot introduced a new channel program in 2017, but thus far all implementations have been handled directly by the vendor. The lack of a strong partner network further inhibits the building out of vertical content.
- Land-and-expand pricing: Much buying in analytics and BI is now driven by business users, with trials often starting through a grassroots approach by individual users. These users download free trial copies of the software or sign on to free SaaS options in order to trial the software. They then help grow adoption and push for a broader proof of concept within an organization. ThoughtSpot, as a server-based solution, has historically lacked the option of a free individual trial. A SaaS option was introduced in 4Q17, which may help address this barrier. ThoughtSpot does not publish the starting price of its software.
TIBCO Spotfire was an early visual-based data discovery disruptor that helped transform the market from traditional reporting to modern analytics and BI. Spotfire offers extensive capabilities for analytics dashboards, interactive visualization and data preparation in a single design tool and workflow, while offering flexible processing options either in-memory or in-database. TIBCO has continued to expand its feature set to include data science, machine learning and streaming analytics, location intelligence, data cataloging and, most recently, data virtualization through acquisition, OEM relationships and integration with TIBCO middleware.
During the past year, TIBCO has continued to invest in sales and marketing and land-and-expand initiatives in order to generate renewed market awareness as well as expansion in its installed base. It also delivered product enhancements in advanced analytics for citizen data scientists, automated insight recommendations, and augmented data preparation, to name a few.
TIBCO Software is positioned in the Visionaries quadrant due to its global presence, clear differentiation for advanced exploration and its augmented and streaming analytics product vision. Strong product scores, particularly for a decentralized use case, have contributed to TIBCO’s Ability to Execute. However, lower customer reference scores for customer and sales experience and operations, coupled with tepid mind share and new buying momentum relative to the Leaders, have detracted from this strength.
- Well-suited for advanced data exploration: Spotfire reference customers select and use the platform for conducting advanced and complex analysis. They score it in the top quartile for this metric, although it can be used for a range of use cases. The platform features an integrated, and increasingly machine-learning-based, automated self-service data preparation capability for building complex data models within a unified design environment for interactive visualization and for building analytic dashboards. As part of that environment, analysts and citizen data scientists have access to an extensive library of embedded advanced analytic functions. This library of functions includes geospatial algorithms and geocoded data; many are drag-and-drop, with some newly added automated insight features and recommended visualizations for selected variables.
- Integrated data science and machine learning: TIBCO’s advanced data exploration is extended by integrated access to its data science runtime engine for the R analytic language, TIBCO Enterprise Runtime for R (TERR). TIBCO’s recent acquisitions of Statistica and Alpine Labs further deepen the options to extend the Spotfire platform with advanced analytics. A higher percentage of TIBCO Spotfire’s reference customers said they selected Spotfire for its advanced analytics/data science integration than for almost any other vendor in this Magic Quadrant.
- Entrenched global customer base of advanced users: Spotfire appeals to users with a range of skill levels, but it has particular appeal to advanced users such as scientists and engineers. These types of users have entrenched Spotfire in many large global organizations across many industries. In particular, it is used in life sciences (through partnership with PerkinElmer) and in engineering departments in the oil and gas, retail and consumer packaged goods, and utilities industries, as well as in manufacturing domains. Seventy-three percent of Spotfire reference customers said they use it for decentralized deployments, which puts it in the top quartile compared with other vendors’ products in this Magic Quadrant. These large, mature and mostly decentralized deployments have contributed to Spotfire’s above-average score for deployment size (by number of users).
- Product vision: TIBCO’s Completeness of Vision rating is due in part to its ability to leverage its range of acquisitions and partnerships to construct a forward-looking roadmap for Spotfire. It has invested early in augmented data preparation and discovery, and in agile data cataloging via an OEM relationship. This enables users to publish and share harmonized datasets in an optional data catalog that is searchable by other users — although seamless workflow between the two products is a work in progress. TIBCO is also relatively early in delivering streaming, operational and IoT analytics, through integrations with TIBCO middleware and other acquisitions (StreamBase and LogLogic).
- Sales experience and pricing: TIBCO’s reference customer scores for sales experience place it in the bottom quartile this year. Like last year, one in five references cite license cost as a barrier to their broader deployment of Spotfire. In a crowded market, the downward pricing pressure affecting the entire market will continue to have an impact on expansion contracts and renewals for Spotfire. In addition to license price concerns, Gartner inquiries suggest that in 2017 there were some instances where TIBCO aggressively audited customers across product areas. This can intensify a negative customer perception, which could have contributed to the erosion of a previous improvement in this area. Moreover, while TIBCO’s partnership with PerkinElmer seems to drive good Spotfire business, there appears to be some channel confusion among life sciences customers, particularly during the sales and renewal process.
- Product areas for improvement: TIBCO’s overall product score places it in the top quartile, but a few gaps exist. Self-service data preparation offers limited impact analysis and certification of data sources, and has some gaps in advanced data inference and profiling. Responsive design is not fully automated, and there is no support for offline exploration. While TIBCO is investing in a cloud-first strategy, its cloud capabilities are rated as only “good,” and several competitors score higher. Publish, share and collaborate features were also scored as “good” overall, but there is a lack of storytelling features and user content ratings, and no integration with social platforms other than TIBCO’s tibbr — which serves as its collaboration environment. Overall, TIBCO may face integration challenges from its many acquisitions during the past year, which can divert resources away from investment in innovation.
- Less intuitive content authoring for casual business users: Spotfire is less intuitive than some competing products for new users, and for casual business users who simply want to assemble lightweight dashboards and analysis. Reference customers rated Spotfire as below average in this Magic Quadrant for content authoring and for visual appeal, and their evaluation placed it in the bottom quartile for ease of administration. In addition, 11% of reference customers cite a lack of ease of use for content authors as a limitation to broader deployment, putting TIBCO in the top quartile of these vendors for this limitation.
- Customer experience and migration: On user enablement, reference customers scored TIBCO as slightly below average for user conferences, documentation, online communities and online tutorials. These capabilities play an important role in a customer’s success with the product, which is a necessary precursor to broader market adoption. These capabilities are also important drivers for achieving business benefits from the platform — an area where TIBCO scored slightly below the survey average, according to its reference customers. Availability of skills from the vendor and the market are both scored as slightly below average. Reference customers evaluated their migration experience in the bottom quartile of vendors in this Magic Quadrant.
Yellowfin delivers a single, web-based analytics and BI platform that supports both Mode 1-style analytics and BI, with scheduled reports and alerting, and Mode 2, with innovative features that include collaboration and emerging augmented analytics capabilities. With robust APIs and an established business model to support its OEM customers, this channel accounts for approximately 30% of the company’s total revenue.
Yellowfin 7.4 was released in October and includes the ability to automatically generate insights using machine learning and NLG. There is also a new lightweight data preparation capability that can be used as an ETL tool, with output to database tables. It can also integrate with external data and inputs such as R scripts and Python.
Yellowfin remains in the Niche Players quadrant and is well-suited to agile, centralized BI provisioning. It is focused on Asia/Pacific, with expanding support in Europe and the U.S. and for SMBs. Despite some visionary elements in its product roadmap (such as a marketplace and augmented analytics), it has been a quick follower rather than a vendor that others imitate. Yellowfin scores relatively low for market understanding and, with less funding than other competitors, has less potential for disrupting the market. Below-average scores for customer experience and operations also contributed to its low position on the Ability to Execute axis.
- Agile, central BI and visual appeal: Yellowfin is a fully web-based product that provides interactive visualization, dashboards, threaded discussions, scheduling and alerting in a single platform. The product has some relatively unique capabilities, such as a workflow tracker for collaborative development, and version 7.4 introduces the ability to automatically generate insights — the beginning of this vendor’s augmented analytics capabilities. Users can also rate and discuss content, with a time-stamped visualization embedded within the discussion — a feature that is rare among the vendors in this Magic Quadrant.
- Clear and attractive pricing: More reference customers choose Yellowfin for its low licensing costs than any other vendor in this Magic Quadrant. Historically, Yellowfin BI has been sold at a named-user subscription price of $600 per user per year, but in 2017 it introduced three new licensing models: Community, Professional and Enterprise. The Community version supports three users for free running in various clouds (such as AWS, Google, Microsoft Azure). The Professional license starts at five users for $1,750, running on as many server cores as the client chooses. The Enterprise version starts at 50 users, with rights to deploy in development as well as production (pricing is not provided). At list price, the Professional edition works out to $350 per user per year (compared with the previous list price of $600). This is not as low as Microsoft Power BI, but is substantially lower than the offerings of many vendors in this report. Yellowfin also offers extranet pricing for anonymous logons. The simple and clear pricing and packaging in part contributes to a positive sales experience, which has improved year over year and is now slightly above average for this Magic Quadrant.
- Diversity and ethics: In 2017, Yellowfin launched a number of initiatives to broaden its community outreach, with multiple events intended to raise the awareness and interest of girls in Science, Technology, Engineering and Mathematics (STEM) programs. The vendor has also been providing free software for academic use since 2013, with additional discounting for not-for-profit organizations. Yellowfin’s self-reported diversity numbers place it above the tech industry average for the percentage of women holding leadership (30%) and technical (35%) positions. Yellowfin’s reference customers scored its ethics, culture and diversity above average for this Magic Quadrant, with 91% considering it “excellent.”
- Cloud ready: In 2017, Yellowfin launched its own cloud capabilities, running on AWS and NTT Communications. Prior to this, customers could run Yellowfin in their own private clouds; the product is multitenant. Nearly 70% of Yellowfin’s customer references report using the product through either a public or private cloud; more than for most other vendors in the Magic Quadrant (except the cloud-only specialists).
- Scalability concerns: Yellowfin’s origins lie mainly in enabling and improving the user experience for traditional BI users wanting reports and dashboards from data primarily in relational databases. Its portfolio and capabilities have evolved to support more sophisticated analytics with some in-memory capabilities, but its deployment sizes remain in the smaller ranges — in the bottom quartile of Magic Quadrant vendors for data volumes. Nine percent of Yellowfin’s reference customers cited the inability to handle the necessary data volumes as a platform problem, which is above the survey average. Though few reference customers cited user scale as a concern, the average number of users per deployment is slightly below average for this Magic Quadrant.
- Low market awareness and momentum: As a privately held vendor with no venture capital funding, Yellowfin does not benefit from some of the free press that publicly held vendors enjoy. The company claims it added 15,000 new logos in 2017, with a 20% growth in revenue; this is below the forecast growth of the market (at 28% for 2017) and lower than the growth of Yellowfin’s smaller competitors. Its social media following on Twitter and LinkedIn remains low relative to leading competing vendors in this Magic Quadrant.
- Operations and upgrade challenges: Yellowfin received low operations scores from its reference customers this year, primarily due to product quality (ranked in the bottom quartile) and upgrade problems from version 7.2 to 7.3. Version 7.4 was released too recently for us to assess the upgrade process this time.
- Market understanding: Market understanding is an important aspect in a vendor’s Completeness of Vision rating, and one where Yellowfin ranks in the bottom quartile. In large part, this is due to the product’s limited breadth of use by its reference customers (51% of customers use it primarily for parameterized dashboards). To date, data preparation was mainly completed outside the platform, although this may change with version 7.4. While the vendor’s overall ease of use is considered excellent by 68% of its reference customers, it still is below average for this Magic Quadrant.
Vendors Added and Dropped
We review and adjust our inclusion criteria for Magic Quadrants as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant may change over time. A vendor’s appearance in a Magic Quadrant one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. It may be a reflection of a change in the market and, therefore, changed evaluation criteria, or of a change of focus by that vendor.
It is important to note that a vendor’s exclusion this year does not mean that they will not be included in future years and vice versa.
Looker was added to the Magic Quadrant this year.
- Alteryx was excluded based on Gartner analyst opinion — formed through inquiries, customer reference checks, reference surveys and industry events — that it primarily complements, rather than competes with, the other vendors in this Magic Quadrant. It is included in the Magic Quadrant for Data Science and Machine Learning, as well as in the Market Guide for Self-Service Data Preparation.
- ClearStory Data and Zoomdata were excluded because they did not meet one or more of the inclusion criteria for this year’s Magic Quadrant.
- Datameer and Pentaho were excluded because they shifted their market emphasis.
Inclusion and Exclusion Criteria
This year’s Magic Quadrant includes 20 vendors that met all our inclusion criteria, as listed below.
Modern Analytics and BI Platform Assessment
Gartner analysts evaluated each platform against the definition of a modern analytics and BI platform, determined by the extent of IT involvement that is mandatory before a business analyst or information worker can analyze data without IT assistance. Products that did not meet the criteria of a modern analytics and BI platform are those requiring significant IT involvement — either internal or external to the platform — in order to load and model data, create a semantic layer or build data structures as a prerequisite to using the BI platform. IT developer-centric platforms focused on custom coding analytic applications were, likewise, not evaluated for inclusion. Products that met the modern criteria were evaluated for inclusion in the Magic Quadrant based on a funnel methodology in which the requirements for each tier must be met in order to progress to the next tier. Tiers 1 to 3 are evaluated at the vendor level; Tiers 4 and 5 are evaluated at the product level.
- Tier 1. Market Presence — A composite metric, assessing interest from both Gartner’s client base and the broader market via internet search volume, job postings and trend analysis, and social media presence, was calculated for each vendor. Note: Vendors were considered for inclusion based on data as of August 2017. Vendors that are considered complementary to analytics and BI platforms are also excluded.
- Tier 2. Revenue* — For those vendors meeting the market presence criteria (Tier 1), analytics and BI revenue for each vendor was assessed and evaluated. For this assessment, two common license models were assessed and revenue from each was combined (if applicable) and evaluated against the three revenue inclusion levels (shown below) for qualification:
- Perpetual License Model — Software license, maintenance and upgrade revenue (excluding hardware and services) for calendar years 2015, 2016 and 2017 (estimated).
- SaaS Subscription Model — Annual contract value (ACV) for year-ends 2015, 2016 and projected ACV for year-end 2017, excluding any services included in annual contract. For multiyear contracts, only the contract value for the first 12 months should be used for this calculation.
- Revenue inclusion levels are as follows:
- $25 million 2017 (estimated) combined perpetual license revenue + 2017 (estimated) ACV, or
- $15 million 2017 (estimated) combined perpetual license revenue + 2017 (estimated) ACV with 50% year-over-year growth, or
- $10 million 2017 (estimated) combined perpetual license revenue + 2017 (estimated) ACV with 100% year-over-year growth
- * Gartner defines total software revenue as revenue that is generated from appliances, new licenses, updates, subscriptions and hosting, technical support and maintenance. Professional services revenue and hardware revenue are not included in total software revenue (see “Market Share Analysis: Analytics and BI Software, 2016” ).
- Tier 3. Magic Quadrant Evaluation Inputs — Full participation in the Magic Quadrant process requires the following input:
- Completing and providing documentation for an RFP-style questionnaire of detailed critical capabilities.
- Completing an online questionnaire around market presence, growth, go-to-market strategy and differentiation.
- Submission of a video up to one hour long that demonstrates how included products deliver on the predefined analytic scenarios defined by Gartner (we only consider the first hour; anything beyond that is not assessed).
- Verification of final analytics and BI revenue for 2015, 2016 and 2017 (estimated).
- Providing references for an online customer and OEM survey.
- Providing a vendor briefing to the Magic Quadrant authors.
- Providing access to evaluation software.
- Providing factual review of sections in the Magic Quadrant research.
- If a vendor declines to participate and does not respond to requests for supplemental information, Gartner’s analysis is based on other credible sources, including previous vendor briefings, customer inquiries, Peer Insights reviews and other publicly available information.
- Tier 4. Breadth of Coverage — The vendor must demonstrate breadth across vertical industries and geographic regions, as specified by Gartner.
- Tier 5. Product Assessment — Products that progressed to this final tier were assessed by Gartner analysts using the information provided by each vendor in the data collection exercise outlined above. The final step involved narrowing down the field to 20 vendors for inclusion in the Magic Quadrant.
Note: Gartner has full discretion to include a vendor on the Magic Quadrant regardless of its level of participation in the Magic Quadrant process, if the vendor is deemed important to the market. This discretion was not applied this year, because all vendors fully participated in the process.
Ability to Execute
Vendors are judged on Gartner’s view of their ability and their success in making their vision a market reality that customers believe is differentiated and are prepared to buy into. Delivering a positive customer experience — including sales experience, support, product quality, user enablement, availability of skills and ease of upgrade/migration — also determines a vendor’s Ability to Execute.
In addition to the opinions of Gartner’s analysts, the ratings and commentary in this report are based on a number of sources:
- Customers’ perceptions of each vendor’s strengths and challenges, as gleaned from their analytics and BI-related inquiries with Gartner
- An online survey of vendors’ customer references
- A questionnaire completed by the vendors
- Vendors’ briefings, including product demonstrations, strategy and operations
- An extensive RFP questionnaire inquiring how each vendor delivers the specific features that make up our 15 critical capabilities for this market (see “Toolkit: BI and Analytics Platform RFP” )
- A prepared video demonstration of how well vendors’ analytics and BI platforms address the 15 critical capabilities
- Analyst access to evaluation software
Ability to Execute Criteria
* Note: These criteria are scored partly or wholly on the basis of input from the Magic Quadrant customer reference survey.
- Product/Service*: How competitive and successful are the 15 product capabilities offered by the vendor in this market? How integrated is the product’s workflow? How easy to use and visually appealing is it?
- Overall Viability: What is the likelihood of the vendor continuing to invest in products and services for its customers, and how do references rate the vendor’s future relevance? This includes an analyst assessment of the overall organization’s financial health, the financial and practical success of the business unit, and the likelihood of the business unit continuing to invest in and offer the product and to advance innovation within its product portfolio.
- Sales Execution*: This covers the vendor’s capabilities in all presales activities and the structure that supports them. It also includes deal management, pricing, negotiation and contracting, presales support and the overall effectiveness of the sales channel.
- Market Responsiveness and Track Record*: Does the vendor have momentum and success in the current market and is this momentum broad or confined to one geographic region? How diverse is the company’s workforce and how do customers rate its ethics, culture and diversity?
- Customer Experience*: How well does the vendor enable its customers through the availability of training, online tutorials, documentation and conferences, and how available are skilled resources (both in the market and from the vendor) with expertise in its product offerings? This also covers the extent to which customers realize tangible business benefits through use of the vendor’s software.
- Operations*: How well does the vendor support its customers? How trouble-free is the software, and how easy is it to migrate to a newer version?
|Table 1. Ability to Execute Evaluation Criteria|
|Evaluation Criteria ||Weighting |
|Product or Service||High|
|Overall Viability|| |
|Sales Execution/Pricing|| |
|Market Responsiveness/Record|| |
|Marketing Execution||Not Rated|
|Customer Experience|| |
|Operations|| |
Source: Gartner (February 2018)
Completeness of Vision
Vendors are rated on Gartner’s view of their understanding of how market forces can be exploited to create value for customers and opportunity for themselves. The Completeness of Vision assessments and commentary in this report are based on the same sources described in the Ability to Execute section.
When determining Completeness of Vision for the Offering (Product) Strategy criterion, Gartner evaluated vendors’ ability to support the key trends that will drive business value in 2018 and beyond. Existing and planned products and functions that contribute to the above trends were factored in to each vendor’s score for the Offering (Product) Strategy criterion (listed below for Completeness of Vision). These key themes (by category) are as follows:
- Support for a marketplace (of buyers and sellers) where organizations, customers and partners can buy and sell custom-built analytic applications, aggregated data sources, custom visualizations and algorithms that integrate with the analytics and BI platform. Such marketplaces are beginning to form, but remain limited to a subset of vendors.
- Native access to a range of Hadoop, Spark, other NoSQL data sources, graph databases or search databases such as Elasticsearch and Kibana, Attivio or Splunk is becoming increasingly important as data grows in both volume and complexity.
- Support for hybrid deployments across on-premises and multiple clouds. This ranges from hybrid data support for querying on-premises data from the cloud, without first moving it to the cloud, to fully integrated, seamless, hybrid on-premises and multicloud deployments with a single point of administration, consumption, content authoring and licensing.
- A curated agile data catalog where business users can search, access, find and rate certified internal data as well as open and premium external data with workflow — in order to promote harmonized data to certified status. This is becoming crucial to governed modern deployments that leverage complex distributed data with an increasing number of distributed content authors.
- Data harmonization and affinity analysis of additional datasets that improve the analysis. These datasets should be recommended automatically using machine learning, and may extend to external data sources.
- Augmented data preparation on multistructured data is a core visionary feature in this category. The ability to profile, enrich and infer relationships (to automatically generate a model for analysis), and to make recommendations that improve or enhance insights from the data, will be an area of innovation that differentiates vendors in the future.
- The ability to automatically promote user-generated models and content to the SOR and reuse and build on existing variables, calculations, models and content is critical to large-scale trusted self-service.
- Modern push-down processing to big data sources, automating the selection of where to best process a query, is an important feature in supporting large and complex datasets by leveraging big data processing and minimizing the need to move data.
- Support for preparing, harmonizing and leveraging real-time events and streaming data, and pushing real-time results to a consumption layer in support of a range of use cases, is in its infancy. However, it will become an increasingly important data management consideration for organizations to adopt and integrate into analytic solutions in order to enhance their value to the business.
Analysis and Content Creation
- Augmented data discovery, which automates the identification of patterns, anomalies and clusters hidden in data that are often missed by analysts manually exploring datasets, is core to next-generation analytics and BI platforms. The automated identification of insights and findings is key to expanding access to analytics to more users within the organization and to speeding the time to insight while reducing bias.
- Search and NLP for voice and text — to support the concept of a personal analytics assistant that can generate NLQs and explain its findings to users using NLG — will be a dominant future interface for analytics.
- Conversational chatbots change how users interact with data: from what is today mainly dragging and dropping elements onto a page to a more natural-language-based process supported by voice and conversation. As both a query mechanism and a means of interpreting results, conversational analytics represents the convergence of a number of technologies, including personal digital assistants, mobile, bots and machine learning.
- Virtual and augmented reality is still largely in the concept stage and includes the ability to digitize images for data input as well as integrating virtual reality viewing devices with data and dashboards.
- Support for a broad range of content analytics and text analytics against unstructured data is needed as organizations explore new sources of information to link to, and relate to, the analytical insights derived from structured data sources.
Sharing of Findings
- The ability to invoke business actions from within the platform either in a dashboard or embedded in another application represents a level of sophistication beyond current mainstream support for conditional alerts and event triggering based on system events.
- Decision management provides a closed-loop, collaborative workflow for capturing insights, the actions taken, and the reasons for a particular decision.
- Crowdsourcing and contextual recommendations for relevant content — based on insight gained from collaboration and social interaction by users — will largely replace the need to manually share content and findings across the organization.
- Integrated point-and-click simulation, what-if analysis and optimization extend the types of analysis users need, which today are often created using custom calculations.
- Increasingly, organizations need to render analytics content in immersive experiences for different types of users across many touchscreens in boardrooms and operations centers.
Completeness of Vision Criteria
* Note: These criteria are scored partly or wholly on the basis of input from the Magic Quadrant customer reference survey.
- Market Understanding*: Does the vendor have the ability to understand buyers’ needs and to translate those needs into products and services? Ease of use, ability to support complex data requirements, and the types and complexity of analysis users conduct with the platform — all key buying criteria — factor into this rating.
- Marketing Strategy: Does the vendor have a clear set of messages that communicate its value and differentiation in the market? Is the vendor generating differentiated awareness? Are there data-for-good initiatives and social responsibility programs as part of the company’s overall differentiation?
- Sales Strategy*: Does the vendor have an innovative partner strategy, attractive pricing, flexible and clear product packaging, and a strong land-and-expand and enterprise sales model?
- Offering (Product) Strategy: Does the vendor’s approach to product development and delivery emphasize differentiation and functionality that map to current and future requirements, based on the product vision criteria summarized as key trends at the beginning of the Completeness of Vision section?
- Vertical/Industry Strategy: How well can the vendor meet the needs of various industries, such as financial services, life sciences, manufacturing and retail?
- Innovation: Is the vendor focusing its resources, expertise or capital to address key market requirements for competitive advantage? Is the vendor investing in and delivering truly unique and in-demand capabilities? Is the vendor setting standards for innovation that others try to match?
- Geographic Strategy: How well can the vendor meet the needs of locations outside its native country, either directly or through partners?
|Table 2. Completeness of Vision Evaluation Criteria|
|Evaluation Criteria ||Weighting |
|Market Understanding|| |
|Marketing Strategy|| |
|Sales Strategy|| |
|Offering (Product) Strategy||High|
|Business Model||Not Rated|
|Vertical/Industry Strategy|| |
|Innovation|| |
|Geographic Strategy|| |
Source: Gartner (February 2018)
Leaders are vendors that demonstrate a solid understanding of the product capabilities and commitment to customer success that buyers demand in the current market. This is coupled with an easily understandable and attractive pricing model that supports proof of value, incremental purchases and enterprise scale. In the modern analytics and BI platform market, buying decisions are being made by, or at least heavily influenced by, business users that demand easy-to-use and easy-to-buy products. They require that these products deliver clear business value and enable powerful analytics with limited technical expertise and without the requirement for upfront involvement from IT. In a rapidly evolving market with constant innovation, a Leader must also demonstrate that it is not focused only on current execution. It must have a robust roadmap for solidifying its position as a future market leader, thus protecting the investment of today’s buyers.
Summary of Leaders’ Quadrant Positions
Consistent with any maturing technology market, net new buying of modern analytics and BI platforms is now mainstream. Organizations that have been successful with smaller deployments have expanded their use across the enterprise and are increasingly making the modern analytics and BI platform an enterprise standard, and in some cases the enterprise standard, in their organization. Agility and ease of use for business users are still critical buying drivers. However, the ability to govern deployments, promote user-generated content to trusted enterprise sources, deal with complex large datasets, extend and embed analytic content, and support large global deployments has taken on new importance in the buying decision. There are currently three vendors executing sufficiently well on their vision to warrant a position in the Leaders quadrant.
Challengers are well-positioned to succeed in the market. However, they may be limited to specific use cases, technical environments or application domains. Their vision may be hampered by the lack of a coordinated strategy across the various products in their platform portfolios. Alternatively, they may lack the marketing efforts, sales channel, geographic presence, industry-specific content and awareness of the vendors in the Leaders quadrant.
Summary of Challengers’ Quadrant Positions
This year, there is only one vendor executing at a level that may challenge the market leaders. However, with a narrower focus on its existing installed base and a less disruptive product roadmap than both the Leaders and Visionaries, it risks continued competition and disruption from more broadly focused competitors.
Visionaries have a strong and unique vision for delivering a modern analytics and BI platform. They offer depth of functionality in the areas they address; however, they may have gaps relating to broader functionality requirements, or lower scores on customer experience, operations and sales execution. Visionaries are thought leaders and innovators, but they may be lacking in scale, or there may be concerns about their ability to grow while still providing consistent execution.
Summary of Visionaries’ Quadrant Positions
There are two main sets of vendors in the Visionaries quadrant, separated largely by their Ability to Execute. The first set of vendors provides a modern product offering backed by an established customer base, but they have emerging or hampered momentum. The second set of vendors provides an innovative and potentially disruptive product vision, but they have either gaps in their current offerings or a lack of visibility and traction in the current market (or a combination of both).
Niche Players do well in a specific segment of the analytics and BI market — such as cloud BI, customer-facing analytics, agile reporting and dashboarding, embeddability or big data analytics — or have a limited capability to out-innovate or outperform other vendors. They may focus on a specific domain or aspect of BI, but are likely to lack depth of functionality elsewhere. They may also have gaps relating to broader platform functionality, or have less-than-stellar customer feedback. Alternatively, Niche Players may have a reasonably broad BI platform, but limited implementation and support capabilities or relatively limited customer bases (such as in a specific geography or industry). In addition, they may not yet have achieved the necessary scale to solidify their market positions.
Summary of Niche Players’ Quadrant Positions
Almost half of the vendors in this Magic Quadrant are included in the Niche Players quadrant this year. All nine vendors represented in the Niche Players quadrant have specialized strengths and differentiated capabilities that position them well to meet the rapidly evolving customer requirements of this market.
Readers should not use this Magic Quadrant in isolation as a tool for vendor selection. In 2016, Gartner dramatically modified and modernized the underlying analytics and BI platform definition in order to reflect the segment of the overall market where the majority of active net new buying is taking place. As a result of this change, historical comparison with Magic Quadrants prior to 2016 (to assess vendor movement) is irrelevant and strongly discouraged.
This Magic Quadrant is an assessment of vendors’ capabilities based on past execution in 2017 and on future development plans, and it is only valid at a particular point in time as vendors and the market evolve. When making specific tool selection decisions, use it in combination with our Market Guide for traditional enterprise reporting platforms, Critical Capabilities, Survey Analysis research, and Strength, Weakness, Opportunity and Threat (SWOT) analysis publications, as well as our analyst inquiry service. Moreover, readers should be careful not to ascribe their own definitions of Completeness of Vision or Ability to Execute to this Magic Quadrant; these are often incorrectly mapped narrowly to product vision and market share, respectively. The Magic Quadrant methodology factors in a range of criteria in determining position, as shown by the extensive Evaluation Criteria section.
Overall, the analytics and BI market grew an estimated 10% (adjusted for constant currency) in 2017, with an expected growth of 8% through 2021 — as reflected in Gartner’s current estimate of the compound annual growth rate for the sector (see “Forecast: Enterprise Software Markets, Worldwide, 2014-2021, 4Q17 Update” ). The modern subsegment of the analytics and BI market segment continues to expand much more rapidly than the overall market, showing an estimated 28% growth in 2017, which will decelerate to 17% (in constant currency) by 2021. Customers are currently expanding their deployments for users and content, but downward pricing pressure and a certain saturation point will contribute to this deceleration.
Key trends in this year’s Magic Quadrant include:
- Augmented analytics. The 2017 Magic Quadrant showed that many megavendors late to the visual-based data discovery disruption were early to the third wave of disruption in the form of augmented analytics. Augmented analytics includes machine-learning-enabled analytics and BI during all phases of the analytics workflow, from data preparation to data modeling to insight generation. Interaction also moves from primarily drag-and-drop query building to interfaces based more on voice, search and NLP. Best practices in visualizations are enhanced with NLG to explain findings. Megavendors and startups alike have executed on their augmented analytics roadmaps. While this does not yet reflect mainstream buying, it is a proof point for customers that vendors are innovating at a rapid pace (see “Augmented Analytics Is the Future of Data and Analytics” ). In addition, customers that were early to adopt visual-based discovery are now facing the proliferation of data and user-created analyses. As they mature their analytics, these same customers seem most receptive to this next wave of disruption, which has the potential to help users find the most important insights more quickly, particularly as data complexity grows.
- Data scalability and model complexity. Data scalability and model complexity of analytics and BI platforms are under increased pressure as data storage options shift from single, relational storage to more varied, NoSQL data storage. Also, as data literacy improves, users are asking more sophisticated questions — with multiple data sources and menu-driven predictive analytics. The rise in data lakes, as part of the overall information architecture, forces analytics and BI teams to decide how best to model the data, and where. Should data be replicated into the in-memory engine of the analytics and BI tool? Are the downsides associated with data replication worth the benefit of performance? Products continue to differ substantially in these capabilities. Although some have made scalability and model complexity their hallmarks (MicroStrategy, for example), other vendors are working to address these shifts: Tableau acquired Hyper and is using it to replace its data extract model; Oracle now includes Essbase in its Oracle Analytics Cloud; and Pyramid Analytics has completely rearchitected its product to address these requirements.
- Embedded and enabling a community. Embedded analytics and BI remains an important use case, because customers wish to create extranet applications, monetize data, and provide analytics and BI as part of an overall business application. The size of this market is difficult to quantify, because the primary development approach is often custom development. Of the vendors in this Magic Quadrant most often used for this use case, Logi Analytics has redoubled its efforts in this segment, and embedded analytics also remains a large portion of Sisense customers’ use cases. The embedded use case also applies to analytics and BI vendors embedding content within an operational process and their own business applications in order to reach frontline decision makers, which is a differentiator for Salesforce. Qlik is aggressively pursuing this segment in trying to appeal to the developer community, but only a small portion of its reference customers report deploying in this use case. However, vendors are also trying to ensure that their platforms are now more open, as a way of enabling the community to extend capabilities that are not available out of the box. Such extensions can be in the form of visualizations, algorithms, calculations and prebuilt analytical apps.
- Social responsibility. Many of the evaluated vendors have long had special discounts for not-for-profit organizations, with free software programs for students both as part of social responsibility initiatives and as a way of seeding the market at university level. In addition, the move toward social responsibility has expanded as several vendors work to reduce their carbon footprint, publicly disclosing progress and pledging to donate a proportion of their profits back to the community. Data for Good, in which vendors host hackathons on public datasets and societal problems, or donate software and services to charitable initiatives, has become a way of giving back to society. It has also become a way of differentiating when recruiting from a tight talent pool. This year’s Magic Quadrant is the first to include evaluation of a vendor’s social responsibility initiatives and diversity metrics.
- Downward pricing pressure, subscription-based pricing and enterprise license agreements. Early in 2015, Microsoft substantially lowered the per-user price for Power BI, putting downward pricing pressure on all vendors in the space. This pressure continued in 2017, with megavendors sometimes bundling modern analytics and BI capabilities with their traditional platforms or including them as part of maintenance. Customers show that they are willing to pay a premium for differentiating capabilities, particularly new and emerging ones such as augmented analytics. Most buyers initially look at the licensing cost; however, Gartner continues to advocate that customers look at the TCO, which includes deployment, scale-up, differences in content authoring effort, and ongoing training and enablement. A number of vendors have also introduced subscription-based pricing to lower the entry point for customers; however, this does not necessarily lower the long-term licensing cost, which is often higher than perpetual licensing after three to four years. In addition, user-based pricing has been ideal for a “land” model and for small deployments, but the cost has often been prohibitive for large-scale deployments. Most vendors now offer a land-and-expand model, although not all have evolved their enterprise agreements equally well.
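The subscription-versus-perpetual trade-off is easy to see with a back-of-the-envelope comparison. The figures below are invented purely for illustration (they are not vendor list prices, and the 20% maintenance rate is an assumed industry convention):

```python
# Hypothetical illustration of why subscription licensing can exceed
# perpetual licensing cost after three to four years.

def perpetual_cost(years: int, license_fee: float, maintenance_rate: float = 0.20) -> float:
    # One-time license fee plus annual maintenance as a fraction of the fee.
    return license_fee + years * maintenance_rate * license_fee

def subscription_cost(years: int, annual_fee: float) -> float:
    # Recurring annual fee with no upfront license.
    return years * annual_fee

# With these assumed numbers, subscription is cheaper in years 1-3
# but overtakes perpetual in year 4.
for year in range(1, 6):
    print(year, perpetual_cost(year, 1000), subscription_cost(year, 475))
```

The crossover year depends entirely on the assumed fees and maintenance rate, which is why Gartner recommends evaluating TCO rather than the entry price alone.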
- Cloud past the tipping point. Analytics and BI in the cloud is now past the tipping point. Most net new deployments originate in the cloud, and more than 70% of this year’s reference customers are already using a public or private cloud (versus just over 40% in 2017). This change is highlighted by some offerings in this Magic Quadrant being cloud-only or primarily cloud-based (such as Salesforce Einstein Analytics, SAP Analytics Cloud, IBM Watson Analytics and Domo), and by the fact that some innovations are happening in the cloud first. In addition, cloud deployments have become more flexible, with hybrid data connectivity to on-premises data sources being more broadly supported. The concept of multicloud — in which customers can choose to run their SaaS analytics application in any cloud infrastructure as a service (IaaS) offering (such as Amazon Web Services and Microsoft Azure) — is in its infancy, with Qlik and SAP (for example) early to execute on this strategy. Some vendors also give customers the choice of deploying their own instances of an analytics and BI application in their IaaS provider of choice. In addition, vendors are now offering a single license that encompasses both on-premises and cloud users; Microsoft was first to market with this in Power BI Premium, with SAP, Oracle and Qlik more recently developing similar options. While a single, comprehensive license is an ideal starting point, customers also want the ability to manage and administer content and users across these deployment models; as yet, most vendors do not support this comprehensive deployment. Although the level of cloud deployment is high among the reference customers, Gartner ITScore assessments show, by contrast, that 68% of Gartner customers taking these assessments have no cloud strategy, with only 7% in production. 1
- Supporting real-time events and streaming data and analysis will expand use cases. Organizations will increasingly want to leverage the streaming data generated by devices, sensors and people in a connected world in order to make faster decisions. The players in the analytics and BI market will need to invest in similar capabilities in order to offer buyers a single platform on which to combine real-time events and streaming data with other types of source data, and to develop a new breed of high-impact analytic applications that can leverage the power of real-time actionable insight.
- Marketplaces for content, data and algorithms will expand and mature, creating new opportunities for organizations to buy and sell analytic capabilities and speed time to insight. Active marketplaces where buyers and sellers converge to exchange analytic applications, curated data sources, custom visualizations and algorithms are likely to grow in the analytics and BI space and contribute to its future growth. An established marketplace also provides BI vendors with a new channel, through which solutions built on top of their platforms can be sold into their customer base or partner networks. The main beneficiary of a mature marketplace is the end-user organization, which gains access to a virtually limitless array of capabilities that can be leveraged in its own internally developed solutions and processes.
- With the next wave of market disruption, new and innovative vendors will continue to emerge, but this change should be considered as part of an overall strategy. During the next several years, buyers will benefit from significant market investment in innovation from large vendors, as well as from venture capital investment in innovative startups. The downside of having a plethora of innovative products to pilot, and vendors with which to engage in proofs of concept, is the tendency for organizations to incur technical debt over time. This can happen as multiple stand-alone solutions that demonstrate business value quickly (and often hastily) turn into production deployments, without adequate attention being paid to design, implementation and support. In this rapidly evolving market, organizations should be careful to limit their technical debt by developing a formal strategy and reference architecture to work within when evaluating their options, thereby avoiding major rework and redesign efforts in the future.
Acronym Key and Glossary Terms
|ACV||annual contract value|
|AWS||Amazon Web Services|
|CPM||corporate performance management|
|ETL||extraction, transformation and loading|
|IoT||Internet of Things|
|KPI||key performance indicator|
|SDK||software development kit|
|SOR||system of record|
|TCO||total cost of ownership|
1 This data point is based on 837 Gartner ITScore assessments for analytics and BI taken from October 2016 to October 2017.
Total Software Revenue
Gartner defines total software revenue as revenue that is generated from appliances, new licenses, updates, subscriptions and hosting, technical support and maintenance. Professional services are not included in total software revenue (see “Market Share Analysis: Analytics and BI Software, 2016” ). Gartner’s analysis and the ratings and commentary in this report are based on a number of sources:
- Customers’ perceptions of each vendor’s strengths and challenges, as gleaned from their analytics and BI-related inquiries to Gartner
- An online survey of vendors’ reference customers
- A questionnaire completed by the vendors
- Vendors’ briefings, including product demonstrations, strategy and operations
- An extensive RFP questionnaire inquiring about how each vendor delivers the specific features that make up our 15 critical capabilities (see “Toolkit: BI and Analytics Platform RFP” )
- A prepared video demonstration of how well vendor BI platforms address specific functionality requirements across the 15 critical capabilities
- Access to evaluation software from each vendor
Online Survey for This Magic Quadrant
An online survey was developed and hosted by Gartner as part of its research. Vendor-provided reference customers (end-user customers and OEMs) and respondents from last year’s survey provided data. The survey was conducted from 8 September 2017 through 5 October 2017.
The survey results used in this document derive from 1,526 responses as follows:
- Vendor-identified reference customers (1,219) or 80%
- References from the previous year’s survey that also participated in this year’s survey (147) or 10%
- OEM reference customers (160) or 10%
Although this is a substantive pool of responses for directional inference, vendor reference data is not representative of the total analytics and BI market, but rather of the customers that elected to participate (see Table 3, which gives a breakdown of qualified respondents per vendor).
Table 3. Qualified Responses by Vendor

| Vendor | Qualified Respondents |
|---|---|
Source: Gartner (February 2018)
Mode 1 and Mode 2 Definitions
A bimodal approach is the practice of managing two separate but coherent styles of work — one focused on predictability, the other on exploration:
- Mode 1 focuses on predictability and has a goal of stability. It is best used where requirements are well-understood in advance and can be identified by a process of analysis. It includes the necessary investment to renovate and open up the legacy environment for the digital world.
- Mode 2 is exploratory, involving experimentation to solve new problems, and optimized for areas of uncertainty. In this case, requirements are not well understood in advance. Mode 2 is best suited for areas in which an organization cannot make an accurate and detailed predefined plan because not enough is known. Mode 2 efforts don’t presume to predict the future, but allow the future to reveal itself in small pieces. Work often begins with a hypothesis that proves true, proves false or evolves during a process that typically involves short iterations/projects.
Work that spans both modes forces development teams to manage dependencies involving speed of delivery, which is also impacted by the architecture and design of the applications involved. The ability to effectively integrate the more predictable evolution of products and technologies (Mode 1) with the new and innovative (Mode 2) is the essence of a mature bimodal capability.
Customer Survey Metrics Referenced in This Report
Magic Quadrant customer survey composite success measures are referenced throughout the report. Reference customer survey participants scored vendors on each metric; the composite measures were calculated as follows:
- Customer Experience: This is a combined score consisting of ratings for achievement of business benefits, availability of skills and user enablement (which includes scores for training, online videos, online communities and documentation), and is based entirely on survey reference responses.
- Operations: This is a combined score consisting of ratings for product quality, support and ease of migration, and is based entirely on survey reference responses.
- Sales Experience: Customers rate their satisfaction with presales, contracting, pricing and account management.
- Market Understanding: This is a composite measure of ease of use (for consumers, for developers, and for administration and deployment), visual appeal, and complexity of analysis (as described below). We believe these two measures — ease of use and complexity of analysis — map to current buying requirements.
- Complexity of Analysis: This is a combined score consisting of an analyst opinion rating of how well the platform handles complex data needs, and a survey-based weighted average of the types of analysis respondents report conducting with the platform. More interactive and advanced types of analysis result in a higher score than static or parameterized reporting. Activities are weighted as follows:
- Viewing static reports = 1
- Using parameterized reports and dashboards = 1
- Data integration and preparation = 2
- Performing simple ad hoc analysis = 3
- Using predictive analytics and/or data mining models = 3
- Interactive exploration and analysis of data = 4
- Performing moderately complex to complex ad hoc analysis = 5
- The analyst opinion rating of how well the platform handles complex data needs was based on an assessment of:
- Diversity of data source connectivity
- Ability to combine multiple data sources
- Support for streaming data
- Multipass SQL capabilities
- Ability to federate data
- User Enablement: This is a composite score consisting of individual ratings for documentation, online tutorials for content authors, online tutorials for consumers, online communities, training, availability of skills and user conferences.
- Business Benefits: The business benefits score is the average of ratings across 10 benefit areas, as follows:
- Increased revenue
- Better, faster decisions
- Improved customer satisfaction
- Reduced IT head count
- Reduced external IT costs
- Reduced non-IT costs
- Expanding types of analysis
- Making better insights available to more people
- Linking KPIs to corporate objectives
- Monetizing data
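To make the Complexity of Analysis calculation above concrete, the following is a minimal sketch of the survey-based weighted average. The activity weights come from the report; the respondent usage percentages are invented purely for illustration, and the normalization choice (dividing by total reported usage) is an assumption, since the report does not spell out the exact formula.

```python
# Activity weights as listed in the report (higher = more advanced analysis).
ACTIVITY_WEIGHTS = {
    "viewing_static_reports": 1,
    "parameterized_reports_and_dashboards": 1,
    "data_integration_and_preparation": 2,
    "simple_ad_hoc_analysis": 3,
    "predictive_analytics_or_data_mining": 3,
    "interactive_exploration_and_analysis": 4,
    "moderately_complex_ad_hoc_analysis": 5,
}

def complexity_score(usage_pct):
    """Weighted average of activity usage: each activity's share of
    respondents times its weight, normalized by total reported usage.
    The normalization is an assumed interpretation of the methodology."""
    total = sum(usage_pct.values())
    if total == 0:
        return 0.0
    return sum(ACTIVITY_WEIGHTS[a] * p for a, p in usage_pct.items()) / total

# Hypothetical respondent percentages for one vendor (made-up data):
usage = {
    "viewing_static_reports": 40,
    "parameterized_reports_and_dashboards": 30,
    "data_integration_and_preparation": 20,
    "simple_ad_hoc_analysis": 25,
    "predictive_analytics_or_data_mining": 10,
    "interactive_exploration_and_analysis": 35,
    "moderately_complex_ad_hoc_analysis": 15,
}
score = complexity_score(usage)  # falls between 1 and 5 by construction
```

A vendor whose respondents mostly view static reports would score near 1, while one whose respondents mostly perform complex ad hoc analysis would score near 5.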
Change in Critical Capabilities From Last Year
Critical Capabilities Dropped or Changed:
- Combined Platform and Workflow Integration with Ease of Use, Visual Appeal and Workflow Integration.
- Renamed Embedded Advanced Analytics to Advanced Analytics for Citizen Data Scientist.
- Renamed Smart Data Discovery to Augmented Data Discovery.
- Added Scalability and Data Model Complexity; several of its subcriteria previously existed within Metadata Management and within BI Platform Administration, Security, and Architecture.
Evaluation Criteria Definitions
Ability to Execute
Product/Service: Core goods and services offered by the vendor for the defined market. This includes current product/service capabilities, quality, feature sets, skills and so on, whether offered natively or through OEM agreements/partnerships as defined in the market definition and detailed in the subcriteria.
Overall Viability: Viability includes an assessment of the overall organization’s financial health, the financial and practical success of the business unit, and the likelihood that the individual business unit will continue investing in the product, will continue offering the product and will advance the state of the art within the organization’s portfolio of products.
Sales Execution/Pricing: The vendor’s capabilities in all presales activities and the structure that supports them. This includes deal management, pricing and negotiation, presales support, and the overall effectiveness of the sales channel.
Market Responsiveness/Record: Ability to respond, change direction, be flexible and achieve competitive success as opportunities develop, competitors act, customer needs evolve and market dynamics change. This criterion also considers the vendor’s history of responsiveness.
Marketing Execution: The clarity, quality, creativity and efficacy of programs designed to deliver the organization’s message to influence the market, promote the brand and business, increase awareness of the products, and establish a positive identification with the product/brand and organization in the minds of buyers. This “mind share” can be driven by a combination of publicity, promotional initiatives, thought leadership, word of mouth and sales activities.
Customer Experience: Relationships, products and services/programs that enable clients to be successful with the products evaluated. Specifically, this includes the ways customers receive technical support or account support. This can also include ancillary tools, customer support programs (and the quality thereof), availability of user groups, service-level agreements and so on.
Operations: The ability of the organization to meet its goals and commitments. Factors include the quality of the organizational structure, including skills, experiences, programs, systems and other vehicles that enable the organization to operate effectively and efficiently on an ongoing basis.
Completeness of Vision
Market Understanding: Ability of the vendor to understand buyers’ wants and needs and to translate those into products and services. Vendors that show the highest degree of vision listen to and understand buyers’ wants and needs, and can shape or enhance those with their added vision.
Marketing Strategy: A clear, differentiated set of messages consistently communicated throughout the organization and externalized through the website, advertising, customer programs and positioning statements.
Sales Strategy: The strategy for selling products that uses the appropriate network of direct and indirect sales, marketing, service, and communication affiliates that extend the scope and depth of market reach, skills, expertise, technologies, services and the customer base.
Offering (Product) Strategy: The vendor’s approach to product development and delivery that emphasizes differentiation, functionality, methodology and feature sets as they map to current and future requirements.
Business Model: The soundness and logic of the vendor’s underlying business proposition.
Vertical/Industry Strategy: The vendor’s strategy to direct resources, skills and offerings to meet the specific needs of individual market segments, including vertical markets.
Innovation: Direct, related, complementary and synergistic layouts of resources, expertise or capital for investment, consolidation, defensive or pre-emptive purposes.

Geographic Strategy: The vendor’s strategy to direct resources, skills and offerings to meet the specific needs of geographies outside the “home” or native geography, either directly or through partners, channels and subsidiaries as appropriate for that geography and market.