insideBIGDATA Latest News – 4/27/2021

In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

Actian Powers Journey to Google Cloud with Availability of Avalanche™ Cloud Data Warehouse

Actian, a leader in hybrid cloud data analytics, announced the immediate availability of its Avalanche hybrid cloud data warehouse on the Google Cloud Marketplace. Organizations can now use Avalanche to power some of their most demanding operational analytics and BI workloads, achieving up to 12x better performance than alternatives at a fraction of the cost. Avalanche is delivered as a turnkey, fully managed service that utilizes Google Kubernetes to deploy in minutes and is easy to scale up and down based on business need. Avalanche also includes built-in data integration, easing the movement of data in and out of the data warehouse.

“The immediate availability of Avalanche on the Google Cloud Marketplace now enables customers to gain real-time insights from their operational data,” said Vikas Mathur, SVP, Products at Actian. “As more organizations connect to existing and new data sources, we are going all in on our collaboration with Google Cloud to help customers achieve success in the current landscape. We are excited to deliver best in class analytics performance, breakthrough cost savings, and ease of deployment to customers via the Google Cloud Marketplace.”

ScaleOut Software Announces Geospatial Mapping for its ScaleOut Digital Twin Streaming Service™

ScaleOut Software released new visualization capabilities for its Azure-hosted ScaleOut Digital Twin Streaming Service™ and companion on-premises streaming analytics platform. This innovative streaming analytics approach enables organizations to separately process and analyze incoming streaming data from thousands of data sources and to gain immediate insights that previously required offline batch processing to uncover. With the addition of geospatial mapping for real-time continuous queries, users can now visualize key analytics results with richer contextual information and boost their situational awareness of complex, dynamic systems.

Geospatial mapping of streaming analytics results brings data to life for numerous applications. For example, telematics systems can track thousands of vehicles and immediately identify on a map which vehicles need the assistance of a dispatcher based on continuous, real-time analysis of their incoming telemetry. Mapping of query results makes unusual situations, such as highway blockages or congregated drivers, immediately apparent. Security and safety applications for industrial infrastructures and smart cities benefit from this technology by enabling personnel to see curated, real-time data about multiple potential threats so that their relationships and dynamic changes can be immediately assessed.
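
To make the pattern concrete, here is a minimal plain-Python sketch of a continuous query over per-vehicle digital twins, in the spirit of the telematics example above. It is not the ScaleOut API; the twin fields, the 15-minute threshold, and the event format are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class VehicleTwin:
    """Illustrative digital twin holding the latest state for one vehicle."""
    vehicle_id: str
    lat: float
    lon: float
    speed_mph: float
    minutes_stationary: float

def ingest(twins: dict, event: dict) -> None:
    """Update one vehicle's twin from an incoming telemetry event."""
    t = twins.setdefault(event["vehicle_id"],
                         VehicleTwin(event["vehicle_id"], 0.0, 0.0, 0.0, 0.0))
    t.lat, t.lon, t.speed_mph = event["lat"], event["lon"], event["speed_mph"]
    t.minutes_stationary = t.minutes_stationary + 1 if t.speed_mph < 1 else 0

def needs_dispatcher(t: VehicleTwin) -> bool:
    """Continuous-query predicate: flag vehicles stationary too long."""
    return t.minutes_stationary >= 15

twins: dict[str, VehicleTwin] = {}
stalled = {"vehicle_id": "truck-7", "lat": 47.6, "lon": -122.3, "speed_mph": 0.0}
for _ in range(16):          # sixteen minutes of telemetry with no movement
    ingest(twins, stalled)

# The geospatial-mapping step would plot these (lat, lon) points on a map.
print([(t.vehicle_id, t.lat, t.lon) for t in twins.values() if needs_dispatcher(t)])
```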

“We are excited to add powerful new visualization features to our ScaleOut Digital Twin Streaming Service,” said Dr. William Bain, ScaleOut Software’s CEO and founder. “Geospatial mapping helps our users fully benefit from the powerful real-time capabilities of memory-based, digital twin technology. We believe that combining digital twins with geospatial mapping creates an important breakthrough in extracting value from streaming data.”

Aigenpulse launches CytoML 5.2: automated flow cytometry with unbiased analysis

Aigenpulse has rolled out an update to its CytoML Experiment Suite – its automated, end-to-end, machine learning solution specifically aimed at streamlining and automating cytometry analysis at scale and replacing manual gating processes. The latest release of the Suite (v5.2) introduces new unbiased analysis features and has an easy-to-use interface with no need for difficult installation or programme scripting. Users can perform automated analyses in an unbiased manner for exploratory use cases, including FlowSOM and Phenograph for algorithm-based clustering, and use powerful dimensionality reduction methods such as tSNE and UMAP to visualise connected data.
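
CytoML itself is a proprietary suite driven through its interface rather than code, so as a rough illustration of the unbiased workflow described above, the sketch below uses open-source stand-ins: UMAP for dimensionality reduction and k-means in place of FlowSOM/Phenograph-style clustering (pip install umap-learn scikit-learn). The synthetic matrix is a stand-in for compensated, transformed cytometry events.

```python
import numpy as np
import umap
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic stand-in for cytometry data: 5,000 cells x 12 markers.
events = rng.normal(size=(5000, 12))

# Dimensionality reduction for visualization of connected data.
embedding = umap.UMAP(n_neighbors=15, min_dist=0.1).fit_transform(events)

# Algorithm-based clustering in place of manual gating.
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(events)

# Each cell now has a 2-D coordinate for plotting and a cluster label
# (a candidate cell population) found without prior assumptions.
print(embedding.shape, np.bincount(labels))
```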

“Where researchers need data to support a regulatory use case, guided/semi-automated analysis is key because it is 100% reproducible. However, there is a depth of rich data that underpins the information provided by flow cytometry, and here, unbiased analysis for exploratory use cases can help uncover new insights by finding novel populations or clustering non-intuitive populations together, for instance. Unbiased analysis tools allow complex multi-dimensional data to be simplified, unified, processed and visualised so that it can be more easily explored and compared. This kind of analysis can be very useful in exploring data without any prior assumptions, as a means to uncover novel insights. It is a complementary technique to semi-automated approaches and is interoperable within the CytoML 5.2 Suite, enabling comparison and validation.”

Leader in AI Research Introduces Object Manipulation in Robotics Testing Scenario

The Allen Institute for AI (AI2) announced the 3.0 release of its embodied artificial intelligence framework AI2-THOR, which adds active object manipulation to its already impressive testing framework. ManipulaTHOR is a virtual agent whose highly articulated robot arm, built from three swivel joints connected by limbs of equal length, brings a more human-like approach to object manipulation.

AI2-THOR is a testing framework to study the problem of object manipulation in more than 100 visually rich, physics-enabled rooms. By enabling the training and evaluation of generalized capabilities in manipulation models, ManipulaTHOR (detailed in an accompanying research paper) allows for much faster training in more complex environments as compared to current real-world training methods, while also being far safer and more cost-effective.
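
AI2-THOR ships as an open-source Python package (pip install ai2thor). The minimal sketch below drives the arm agent using action names as given in the AI2-THOR documentation at the time of the 3.0 release; treat the scene name, coordinates, and parameters as assumptions to verify against the current docs.

```python
from ai2thor.controller import Controller

# "arm" agent mode loads the ManipulaTHOR agent (assumed kitchen scene).
controller = Controller(agentMode="arm", scene="FloorPlan1")

# Move the agent, raise the arm base, then reach toward a point in space.
controller.step(action="MoveAhead")
controller.step(action="MoveArmBase", y=0.5)
event = controller.step(
    action="MoveArm",
    position=dict(x=0.0, y=0.5, z=0.4),
    coordinateSpace="armBase",   # coordinates relative to the arm's base
)
print(event.metadata["lastActionSuccess"])
```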

“Imagine a robot being able to navigate a kitchen, open a refrigerator and pull out a can of soda. This is one of the biggest and yet often overlooked challenges in robotics and AI2-THOR is the first to design a benchmark for the task of moving objects to various locations in virtual rooms, enabling reproducibility and measuring progress,” said Dr. Oren Etzioni, CEO at AI2. “After five years of hard work, we can now begin to train robots to perceive and navigate the world more like we do, making real-world usage models more attainable than ever before.”

Pricefx Launches Industry’s First AI-powered Market Simulation for Price Optimization

Pricefx, a leader in cloud-native pricing software, announced it has released an AI-powered market simulation solution. Market simulation is designed to enable price optimization in context of the overall product portfolio and the competition. It uses Pricefx Next Gen AI technology to simulate and predict the impact of pricing on customer purchasing behavior. As a result, businesses can make better business decisions – informed by predicted impact – to accelerate profit growth.

Leveraging AI optimization, Pricefx provides businesses a simulated response to various “what if” scenarios. Market simulation evaluates a range of products, the impact on volumes and optimizes pricing for the full portfolio. It simulates the impact of price change to a product on the surrounding products within the business’ own portfolio to identify potential market cannibalization. It predicts real behavior of customers and competitors by simulating the market response to price changes and demonstrates the long-term market impact including market share changes.
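
Pricefx’s Next Gen AI models are proprietary, but a toy constant-elasticity demand model shows the kind of “what if” portfolio simulation described above, including cannibalization via a cross-price term. Every number here is invented purely for illustration.

```python
base_price = {"A": 10.0, "B": 12.0}
base_volume = {"A": 1000.0, "B": 800.0}
own_elasticity = {"A": -1.8, "B": -1.5}    # % volume change per % price change
cross_elasticity = {("B", "A"): 0.6}       # B gains volume when A gets pricier

def simulate(new_price: dict) -> dict:
    """Predict portfolio volumes under new prices with own- and cross-price effects."""
    volumes = {}
    for product, volume in base_volume.items():
        pct = (new_price[product] - base_price[product]) / base_price[product]
        volume *= 1 + own_elasticity[product] * pct
        for (gainer, mover), e in cross_elasticity.items():
            if gainer == product:
                mover_pct = (new_price[mover] - base_price[mover]) / base_price[mover]
                volume *= 1 + e * mover_pct   # cannibalization / substitution effect
        volumes[product] = volume
    return volumes

# What if we raise A's price 10%? A loses volume; some of it shifts to B.
print(simulate({"A": 11.0, "B": 12.0}))
```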

“Market simulation is the next evolution of price optimization,” said Toby Davidson, Chief Product Officer for Pricefx. “AI-powered price optimization is improving business’ viability, driving more revenue and accelerating profit growth. By adding AI-based market simulation, Pricefx continues to deliver real solutions for our customers’ business challenges. We are extending our leadership position in pricing optimization to deliver new and powerful profitability tools to our customers, with the same fast, flexible and friendly approach they have come to expect.”

VAST Data Launches Federal Subsidiary and Bolsters Investment in the Public Sector Market

VAST Data, the storage software company breaking decades-old tradeoffs, unveiled VAST Data Federal, a new subsidiary to support government and defense organizations in unlocking the full potential of their data and enabling applications of the future. The launch of VAST Data Federal further extends the company’s presence in this market, having already closed over $25M in deals with various federal agencies including the U.S. National Institutes of Health.

“From next-gen geospatial imagery to all-flash data lakes for cybersecurity, the public sector has already been an early adopter of our Universal Storage technology to unleash new insights, discoveries, and innovations from their growing datasets,” said Randy Hayes, Vice President of VAST Data Federal. “With VAST Data Federal, we are doubling down on our investment in the space by bringing in a team of experts with the proper credentials to further demonstrate to the market the performance, scale, and cost benefits associated with our Universal Storage platform.”

GoodData Launches Cloud-Native Platform as First Step in New Data as a Service Category

GoodData, a leading global analytics company, launched its new cloud-native analytics platform offering, GoodData Cloud Native (GoodData.CN). The platform represents step one in the definition and implementation of the new Data as a Service (DaaS) category and ecosystem. The analytics industry is in the midst of an accelerating transformation, marked by the rise of cloud data warehouses and the realignment of the data value chain. Data-driven decision making is increasingly business-critical, and companies are quickly realizing that the status quo in business intelligence — an ungoverned, batch-oriented, and costly monolithic approach — is no longer worth the investment. DaaS, as a new model, ushers in a modern era of open, API-first, cloud-based, and real-time analytics.

“If you look at the transformation going on in the market right now, DaaS is the logical next step. DaaS will change the way analytics is done at a fundamental level,” said GoodData founder and CEO Roman Stanek. “Companies have been overspending and seeing little return for more than a decade. The time for experimentation with analytics is over and with the introduction of Data as a Service, and the rise of new platforms like the one we’re launching today, we can unlock an entire new era of analytics.”

MLCommons™ Releases MLPerf™ Inference v1.0 Results with First Power Measurements

MLCommons, an open engineering consortium, released results for MLPerf Inference v1.0, the organization’s machine learning inference performance benchmark suite. In this third round of submissions, the results measured how quickly a trained neural network can process new data for a wide range of applications on a variety of form factors and, for the first time, included a system power measurement methodology.

MLPerf Inference v1.0 is a cornerstone of MLCommons’ initiative to provide benchmarks and metrics that level the industry playing field through the comparison of ML systems, software, and solutions. The latest benchmark round received submissions from 17 organizations and released 1,994 peer-reviewed results for machine learning systems spanning from edge devices to data center servers. To view the results, please visit https://www.mlcommons.org/en/inference-datacenter-10/ and https://www.mlcommons.org/en/inference-edge-10/.
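
The official benchmark uses MLCommons’ LoadGen harness; purely as a rough illustration of what an inference benchmark measures, here is a minimal latency-and-throughput probe for any predict() callable. It is not LoadGen, and it omits MLPerf’s scenario rules and the new power methodology.

```python
import time

def benchmark(predict, sample, n=1000):
    """Measure throughput and tail latency of a single-stream predict()."""
    latencies = []
    start = time.perf_counter()
    for _ in range(n):
        t0 = time.perf_counter()
        predict(sample)
        latencies.append(time.perf_counter() - t0)
    wall = time.perf_counter() - start
    latencies.sort()
    return {
        "throughput_qps": n / wall,
        "p50_ms": 1000 * latencies[n // 2],
        "p90_ms": 1000 * latencies[int(n * 0.9)],  # benchmarks target tail latency
    }

# Stand-in "model": any callable works, e.g. a framework's inference call.
print(benchmark(lambda x: sum(i * i for i in x), list(range(512))))
```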

“As we look at the accelerating adoption of machine learning, artificial intelligence, and the anticipated scale of ML projects, the ability to measure power consumption in ML environments will be critical for sustainability goals all around the world,” said Klaus-Dieter Lange, SPECpower Committee Chair. “MLCommons developed MLPerf in the best tradition of vendor-neutral standardized benchmarks, and SPEC was very excited to be a partner in their development process. We look forward to widespread adoption of this extremely valuable benchmark.”

Amplitude Unveils Digital Optimization System

Amplitude introduced a Digital Optimization System to manage, measure, and optimize the business value of digital product innovation. Building on its #1-rated digital product analytics suite, Amplitude also announced the general availability of its new personalization product, Amplitude Recommend. Recommend is the first personalization solution that leverages customer behavior data in the digital product, together with machine learning models, to determine which behaviors result in the optimal business outcome, such as conversion to purchase or average order size. The system then adapts each individual experience based on these insights to optimize the desired outcome.
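
Amplitude has not published Recommend’s internals, so the sketch below is only a toy version of the underlying idea: fit a model on per-user behavioral counts (synthetic here) and rank which behaviors are most associated with a conversion outcome.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
behaviors = ["viewed_item", "used_search", "added_to_cart", "read_review"]
# Per-user counts of each behavior (hypothetical event taxonomy).
X = rng.poisson(lam=[2, 1, 0.5, 0.8], size=(5000, 4)).astype(float)
# Simulated ground truth: adding to cart strongly drives conversion.
logits = -2.0 + 1.4 * X[:, 2] + 0.3 * X[:, 1]
y = rng.random(5000) < 1 / (1 + np.exp(-logits))

model = LogisticRegression().fit(X, y)
for name, coef in sorted(zip(behaviors, model.coef_[0]), key=lambda t: -t[1]):
    print(f"{name:>14}: {coef:+.2f}")
# A production system would then adapt each user's experience based on
# their predicted propensity, as described above.
```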

“When we started Amplitude, we had a vision to help every company build better digital products through an understanding of how people interact with them,” said Spenser Skates, Amplitude CEO and co-founder. “Today, this vision has never been more important in a world where digital is business survival. We don’t need more ‘digital experiences’ by delivering more features and content measured by ad clicks and web visits. It’s time for a new way. The winners in the next era will be the companies that understand their customers and use these insights to transform experiences from the place value is created: the digital product. That’s why we built the Digital Optimization System. With Amplitude, now every company has a fighting chance to be a digital disruptor.”

Claravine Introduces The Data Standards Cloud™ to Address Enterprise Data Integrity Challenges

Claravine, a leader in Data Integrity, announced The Data Standards Cloud™, which allows data owners and data engineers to finally deliver on enterprise data standards. No longer do teams need to rely solely on their data pipelines to be responsible for making sense of data. Now, organizations can leverage Claravine to manage their data standards and create data integrity globally, providing consistent and quality information to optimize business outcomes.

“So much of the investment going into data systems and data teams is reactive. At Claravine, we believe in a proactive approach that supports a true enterprise data strategy. We support our customers’ efforts to create data integrity by helping them own their data standards through our technology and collaboration. More ETL isn’t the answer. ETL is a time-consuming and soul-crushing process that occupies far too much of data engineers’ and data scientists’ time, and it keeps them from focusing on data analysis and optimizing business outcomes. The only way to eliminate ETL is by empowering data owners to standardize, connect, and control their data standards in a platform like Claravine,” said Verl Allen, chief executive officer of Claravine.

Sequitur Labs Presents Best Practices for Securing AI/ML Models at the Edge

While many businesses have invested in securing their data centers to ensure their protection from outside intruders, hacks, and ransomware incidents, the vast majority have failed to shore up interconnected access points from edge devices, leaving them susceptible to attack. What is needed, say experts at Sequitur Labs, the leader in IoT security for connected devices, are solutions that provide device-level security addressing all the technical, IP, supply chain, and business process challenges manufacturers face, without requiring them to become experts in cryptography and complex hardware security technologies themselves.

With the increased proliferation of IoT devices available in the marketplace today, there are inherently more vulnerable access points than ever. These “smart” devices – from phones to thermostats to autonomous vehicles and beyond – are designed with interconnectivity built into the products to satisfy internet access requirements needed to receive critical updates to firmware or functionality. While secure data centers may be thought to be safe from outside intrusion, IoT devices are often at risk of being exploited by those looking for an easier way into systems.

“IoT device developers need to ensure their products are protected from attacks, safe and secure through the manufacturing process, and able to be managed securely throughout the life of the product,” said Philip Attfield, Co-founder and CEO, Sequitur Labs. “Without the appropriate implementation of IoT Security, vendors risk damage to their products, credibility and brand, as well as the loss of critical IP that is used to conduct complicated tasks that require some level of intelligent functionality with access to sensitive code or data sets.”

Leading Global Retailers Using Qlik to Drive More Value from All Their Data

Retailers are rapidly adopting Qlik® as part of their effort to maximize the value of real-time data for decision-making. Over the past two quarters, retailers have been eager to realize Qlik’s vision of Active Intelligence, which is the ability to leverage real-time, up-to-date information to inform decision-making and trigger downstream business events to capitalize on every business moment in rapidly changing market conditions.

Already facing razor-thin margins in normal circumstances, retailers were pushed by COVID-19 to quickly reimagine their supply chains, warehouse and distribution center operations, delivery models, and physical/digital mix. Even retailers with robust digital presences realized that, to make the necessary pivots, they needed to further break down internal data silos and bring even more real-time, context-aware data to their decision-making. This has driven tremendous interest in adopting Qlik – which serves 2 of the top 3, and 5 of the top 10, NRF Top 100 Retailers – as a catalyst to quickly modernize data pipelines and analytics to support newly imagined operations models with data that drives action, reduces churn and maintains customer satisfaction.

“More and more retail companies are shifting from passive to active business intelligence models, which emphasize the need for real-time data to inform decision-making,” said Poornima Ramaswamy, Executive Vice President, Global Solutions and Partners at Qlik. “Active Intelligence, realized through our end-to-end platform, drives performance improvements and value across our retail customers’ organizations by bringing the right data and insights to decision makers when it’s most relevant and impactful.”

BriefCam Extends Advanced Video Analytics Platform for Multi-Site Deployments to Boost Operational Intelligence and Situational Awareness Across Sites

BriefCam, a leading provider of Video Content Analytics solutions, released BriefCam v6.0, which introduces the new deployment option of a multi-site architecture. This enables businesses with multiple, distributed locations to view aggregate data from all remote sites to uncover trends across locations, optimize operations and boost real-time alerting and response – all while continuing to reap the benefits of BriefCam’s powerful analytics platform for making video searchable, actionable and quantifiable.

“As the adoption of video analytics has grown, businesses have learned that its value goes beyond safety and security to encompass business intelligence and operational efficiencies,” said Igal Dvir, BriefCam VP, Product & Technology. “Our multi-site architecture takes it a step further, allowing our customers to glean lessons from their highest performing localities in order to maximize the success of all locations.”

GigaIO Introduces New Scalability for AI Workloads with FabreX 2.2 For Dynamically Configured Rack-scale Architectures

GigaIO, the creators of next-generation data center rack-scale architecture for AI and High-Performance Computing solutions, announced FabreX release 2.2, the industry’s first native PCI Express (PCIe) Gen4 universal dynamic fabric, which supports NVMe-oF, GDR, MPI, and TCP/IP. This new release introduces an industry first in scalability over a PCIe fabric for AI workloads, by enabling the creation of composable GigaPods™ and GigaClusters™ with cascaded and interlinked switches. In addition, FabreX 2.2 delivers performance improvements of up to 30% across all server-to-server communications through new and improved DMA implementations.

“With our revolutionary technology, a true rack-scale system can be created with only PCIe as the network. The implication for HPC and AI workloads, which consume large numbers of accelerators and high-speed storage like Intel Optane SSDs to minimize time to results, is much faster computation, and the ability to run workloads which simply would not have been possible in the past,” said Alan Benjamin, CEO of GigaIO.

TigerGraph Continues to Drive Graph Analytics and AI Market Momentum, Unveils TigerGraph Cloud on Google Cloud Platform and Expanded Global Developer Community

TigerGraph, provider of a leading graph analytics platform, announced that the company continues to accelerate the adoption, application, and use of graph analytics on the cloud with broadened support across all cloud providers. TigerGraph Cloud is accessible with Amazon AWS, Microsoft Azure, and now, GCP. The company also announced connectors for Snowflake and Tableau, meaning users can access relationship analytics directly from their Snowflake and Tableau dashboards; valuable data insights are now just a few clicks away.
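
For developers, access to TigerGraph Cloud typically goes through the open-source pyTigerGraph client (pip install pyTigerGraph). The sketch below is illustrative only: the host, graph name, and installed-query name are placeholders, and the token setup may differ across client versions, so check the pyTigerGraph docs.

```python
import pyTigerGraph as tg

conn = tg.TigerGraphConnection(
    host="https://your-instance.i.tgcloud.io",  # hypothetical TG Cloud host
    graphname="MyGraph",
)
conn.getToken(conn.createSecret())  # fetches and stores an API token on conn

# Run a query previously installed on the graph, e.g. a relationship
# analysis; query and parameter names here are hypothetical.
results = conn.runInstalledQuery("accounts_sharing_device", {"accountId": "acct-42"})
print(results)
```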

“AI augmentation will create nearly $3 trillion in business value in 2021, with graph being recognized as a foundational capability within today’s organizations,” said Dr. Yu Xu, founder and CEO of TigerGraph. “Businesses have struggled to capitalize on AI because it requires contextual awareness and an understanding of the data across multiple entities. Graph technologies can answer these critical business questions in minutes, where it takes weeks with other data management systems. Our vision behind hosting Graph + AI is to unite the ecosystem, and we’re seeing more of these attendees this year. The fact that there is a 200% increase in Graph + AI Summit registrants — just six months after the first Graph + AI World event — speaks to the ever-increasing appetite for graph among today’s forward-looking companies.”

AtScale Delivers “Live” Query Experience for Microsoft Power BI on Cloud Data Platforms

AtScale, a leading software provider for advanced enterprise analytics, announced its support for Microsoft Power BI’s native Data Analysis Expressions (DAX) interface. Microsoft Power BI customers can use Microsoft’s “live” connection mode with AtScale to provide fast, seamless data access to popular cloud data platforms. As organizations continue to move their enterprise data into cloud platforms, the ability for business intelligence (BI) and data science teams to easily access live data is critical. AtScale’s Semantic Layer platform makes data accessible and “analytics-ready” for Microsoft Power BI users, without the need for data imports or redundant data modeling.

“Customers are adopting Microsoft Power BI in large numbers – and they demand best-in-class integration for their users,” said Christopher Lynch, executive chairman and CEO of AtScale. “With this new integration, AtScale delivers an excellent customer experience for Power BI users looking for ‘live’ data access on cloud data platforms.”

BMC Enables IT to Prioritize, Predict, and Act with New AIOps and AISM Capabilities

BMC, a global leader in software solutions for the Autonomous Digital Enterprise, introduced new AI-driven IT Operations (AIOps) and AI-driven Service Management (AISM) capabilities for the BMC Helix portfolio that will enable IT service and operations teams to predict issues better, resolve them faster, and provide always-on service. These capabilities are powered by the new BMC Helix Platform, which delivers open, cross-domain engagement, observability, and actionability.

Lightup Announces Beta Program for Breakthrough Data Quality Monitoring Solution to Make Data Decisions and Applications Dependable

Lightup, developers of a breakthrough data quality monitoring solution, announced the launch of its beta program to address the growing problem of proactively detecting and identifying issues with data that can have a devastating impact on data-driven application behavior and decisions. Lightup, which is backed by Andreessen Horowitz, is a developer-first solution that can be up and running in minutes, providing organizations with an ideal solution for ensuring data quality for SQL data stores such as Snowflake and Databricks and streaming data sources including Kafka and Segment.
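
Lightup’s beta API is not public, so the following is a generic sketch of the kind of SQL-based check such a tool automates: compute a data-quality metric on a schedule and alert when it crosses a threshold. The table, column, and threshold are hypothetical, and conn can be any DB-API connection (for example, to Snowflake).

```python
def null_rate(conn, table: str, column: str) -> float:
    """Fraction of NULLs in `column` -- a basic data-quality metric."""
    cur = conn.cursor()
    cur.execute(
        f"SELECT AVG(CASE WHEN {column} IS NULL THEN 1.0 ELSE 0.0 END) FROM {table}"
    )
    return cur.fetchone()[0]

def check(conn, table: str, column: str, threshold: float = 0.05) -> None:
    """Raise when the metric drifts past the threshold."""
    rate = null_rate(conn, table, column)
    if rate > threshold:
        # A real deployment would page the data team / open an incident here.
        raise ValueError(f"{table}.{column} null rate {rate:.1%} exceeds {threshold:.0%}")

# Example usage against a hypothetical table:
# check(snowflake_conn, "orders", "customer_id")
```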

“While it is well understood that data is the oxygen that fuels every application and process in an organization, companies are flying blind when it comes to understanding the health of data driving their applications,” said Manu Bansal, co-founder and CEO of Lightup. “With the data volumes, high cardinality, and complex data flows that we are all dealing with today, it is easy to end up with bad data in the pipeline. Lightup’s data quality monitoring solution provides data teams with a crystal clear understanding of the health and quality of the data fueling their applications. This ensures that data outages don’t silently turn into broken applications that can have a devastating impact on a company’s performance and bottom line.”

Tecton Unveils Major New Release of Feast Open Source Feature Store, the Fastest Path to Production for Machine Learning Data

Tecton, the enterprise feature store company and primary contributor to Feast, announced Feast 0.10, the first feature store that can be deployed locally in minutes without dedicated infrastructure. The new release makes it possible for data scientists to reap the benefits of a functionally complete feature store with no infrastructure overhead or maintenance. Feast has seen strong adoption to date with more than 1,700 GitHub stars and contributions from Agoda, Cimpress, Farfetch, Google Cloud, Tecton and Zulily.
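
The release’s headline claim, local deployment in minutes, looks roughly like the sketch below, based on the Feast 0.10 quickstart. The feature view and entity names come from the feast init template, and the Python signature has shifted across Feast versions, so verify against the docs for your release.

```python
# Shell setup (per the Feast 0.10 quickstart of that era):
#   pip install feast
#   feast init driver_repo && cd driver_repo
#   feast apply
#   feast materialize-incremental $(date -u +"%Y-%m-%dT%H:%M:%S")
from feast import FeatureStore

store = FeatureStore(repo_path=".")  # local repo, no dedicated infrastructure

# Low-latency read of the latest feature values for one entity.
online = store.get_online_features(
    feature_refs=["driver_hourly_stats:conv_rate"],
    entity_rows=[{"driver_id": 1001}],
).to_dict()
print(online)
```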

“We originally open sourced Feast to share our feature store technology and accelerate the deployment of all ML-powered applications. Feast 0.10 is a major milestone towards making feature stores easy to adopt for data teams that are just getting started in their operational ML journey,” said Willem Pienaar, creator and an official committer of Feast and architect at Tecton.

Rockset Enables Real-Time Analytics for MySQL and PostgreSQL

Rockset, the real-time indexing database company, announced new integrations for MySQL and PostgreSQL. With these integrations, developers can run sub-second, high-concurrency analytics on data from MySQL and PostgreSQL, in real time. With Rockset’s new MySQL and PostgreSQL integrations, developers can rely on their favorite relational databases for online transaction processing (OLTP), while using Rockset to power real-time analytics at cloud scale. This significantly reduces the load on the primary OLTP database, because Rockset handles the heavy analytical queries that would otherwise add significant cost and risk to the primary database.
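
In practice the offload pattern might look like the hedged sketch below: transactions keep flowing to MySQL or PostgreSQL, while a heavy aggregation runs against the collection Rockset syncs from the primary, via Rockset’s SQL-over-REST endpoint. The endpoint URL, collection name, and fields are assumptions to verify against Rockset’s documentation.

```python
import requests

ROCKSET_KEY = "YOUR_API_KEY"  # hypothetical credentials
URL = "https://api.rs2.usw2.rockset.com/v1/orgs/self/queries"  # assumed region/URL

resp = requests.post(
    URL,
    headers={"Authorization": f"ApiKey {ROCKSET_KEY}"},
    json={"sql": {"query": """
        SELECT status, COUNT(*) AS orders, SUM(total) AS revenue
        FROM commons.orders  -- hypothetical collection synced from MySQL
        WHERE _event_time > CURRENT_TIMESTAMP() - INTERVAL 1 HOUR
        GROUP BY status
    """}},
    timeout=30,
)
# The primary OLTP database never sees this analytical scan.
print(resp.json()["results"])
```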

“MySQL and PostgreSQL are among the most popular databases in the world; however, developers still find them impossible to scale as the volume of data grows. They are either forced to vertically scale existing databases by adding more compute resources, or give up on them completely in favor of a more horizontally scalable option,” said Venkat Venkataramani, co-founder and CEO at Rockset. “These alternatives are painful and no longer acceptable in our cloud-first world, one where scale and speed are of utmost importance. Our new integration for MySQL and PostgreSQL is yet another example of Rockset removing the many barriers developers face when building modern data applications.”

Coveo Hosts International Data Science Challenge for AI and Ecommerce Research

Coveo, a leader in AI-powered relevance platforms that transform search, recommendations, and personalization within digital experiences, announced it will host the 2021 Special Interest Group on Information Retrieval (SIGIR) eCom Data Challenge, leading efforts to further ecommerce research. For the SIGIR eCom Data Challenge, researchers will work to solve two core ecommerce problems — product recommendations and cart abandonment predictions — using an anonymized data set containing millions of shopping sessions from an average ecommerce website.
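
To make one of the two tasks concrete, here is a toy cart-abandonment baseline on made-up session features; the real challenge data is, of course, millions of anonymized sessions rather than this synthetic stand-in.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 10000
# Hypothetical session features: pages viewed, dwell seconds,
# items in cart, prior purchases.
X = np.column_stack([
    rng.poisson(6, n), rng.exponential(120, n),
    rng.poisson(2, n), rng.poisson(1, n),
])
# Synthetic rule: bigger carts and shorter dwell abandon more often.
p = 1 / (1 + np.exp(-(0.4 * X[:, 2] - 0.01 * X[:, 1] - 0.5 * X[:, 3])))
y = rng.random(n) < p

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```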

“We’re honored that SIGIR eCom chose Coveo to host this year’s challenge for what is always a steep competition. I believe it’s a testament to our team’s AI work over the last year, in which we had more than a dozen peer-reviewed papers published,” said Ciro Greco, Director of AI at Coveo. “We’re excited to see what insights the scientific community can derive from our data and to continue to do our part to expand AI and ecommerce research.”

Pega Process AI Delivers Self-Optimizing Process Automation

Pegasystems Inc. (NASDAQ: PEGA) announced Pega Process AI – a new set of Pega Platform™ capabilities that help organizations optimize their business and customer operations in real time. By infusing self-optimizing AI and decision management into its low-code process automation software, Pega offers the only solution that can intelligently triage millions of incoming customer requests, transactions, and other events at enterprise scale. This enables fast and effective event resolutions while helping to lower operating costs and simplify employee and customer experiences.

“Slick user interfaces quickly lose their luster with customers if the back-end processes driving the actual work are too slow and inefficient to deliver on brand promises,” said Don Schuerman, CTO and vice president of product marketing, Pegasystems. “Pega Process AI combines two of Pega’s most advanced solutions – AI and intelligent automation – to help ensure promises made at the front end are promises kept at the back end. By infusing AI into our deep expertise with case management and process automation, we help clients more efficiently and effectively serve their customers and assist their employees.”

Alluxio Improves Interface Support to Accelerate and Simplify Onboarding of Even More Data Driven Applications

Alluxio, the developer of open source cloud data orchestration software, announced the immediate availability of version 2.5 of its Data Orchestration Platform, featuring access via POSIX and S3 interfaces, enabling data platform teams to accelerate data pipelines for both business intelligence and model training with frameworks such as TensorFlow and PyTorch.
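
A key consequence of the POSIX (FUSE) interface is that training code needs no Alluxio-specific client at all: once the namespace is mounted (at an assumed path like /mnt/alluxio below), a standard PyTorch Dataset simply reads files from what looks like the local filesystem.

```python
import os
from torch.utils.data import Dataset, DataLoader

class MountedDataset(Dataset):
    """Reads raw samples from an Alluxio FUSE mount (path is an assumption)."""

    def __init__(self, root: str = "/mnt/alluxio/training-data"):
        self.paths = [os.path.join(root, f) for f in sorted(os.listdir(root))]

    def __len__(self) -> int:
        return len(self.paths)

    def __getitem__(self, idx: int) -> bytes:
        with open(self.paths[idx], "rb") as f:  # served from the Alluxio cache
            return f.read()

# Ordinary DataLoader; nothing here knows the data lives behind Alluxio.
loader = DataLoader(MountedDataset(), batch_size=32, num_workers=4)
```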

“For modern AI / ML data pipelines, the preferred application programming interface (API) for storage access is not HDFS,” said Haoyuan Li, Founder and CEO, Alluxio. “With this release, Alluxio significantly improves support for model training pipelines with an accelerated POSIX API for unified storage access, performance and ease of management.”

InfluxData Releases InfluxDB Notebooks to Enhance Collaboration for Teams Working with Time Series Data

InfluxData, creator of the leading time series database InfluxDB, announced the general availability of InfluxDB Notebooks, a new capability that improves communication for software development teams, ultimately enhancing productivity within InfluxDB Cloud. InfluxDB Notebooks is the first of the company’s new capabilities designed to make it easier for developers to collaborate around time series data within the platform.

Developers rarely work alone; they almost always work in teams. One of the challenges of working together is communication — developers need seamless communication to streamline their development activities. InfluxDB Notebooks addresses this challenge by allowing developers to discuss time series data analyses and trends inside the platform rather than switching conversations to third-party messaging applications, which can slow them down significantly. This new capability allows users to create a durable artifact that shows teams how time series data is analyzed to solve business problems.

“Development teams are more distributed than ever, but until now, they haven’t had the tools they need to seamlessly communicate around time series data,” said Russ Savage, director of product management at InfluxData. “To solve this problem, we’ve reimagined InfluxDB as a way to collaborate around data, not just store it. This new approach will dramatically save time for developers, so they can focus on building software.”

Narrative Launches Limited Preview of Data Shops, Enabling Companies to Quickly Stand Up a Data Business

To simplify the creation of revenue-generating data businesses, Narrative, the data streaming platform that makes it easy to buy, sell and win, announced that it has launched a limited preview of Data Shops. This new Narrative offering helps organizations package, sell, and deliver datasets via their own branded digital storefront, and is immediately available through the company’s Early Access Program.

Data Shops makes it fast and easy for companies to build a new revenue stream by monetizing the data they already collect. It enables data owners to warehouse any type of data—from any source and any industry—package and price it, and sell it via a customized online store. With Data Shops, owners retain full control of their data go-to-market strategy, including licensing terms and who can access their shop.

“Our goal with Data Shops is to make it as easy to spin up a store for data as Shopify has made it to create an e-commerce experience for physical goods,” said Narrative CEO and founder Nick Jordan. “We take the Shopify analogy one step further by providing logistical support as well, from warehousing data to delivering it to buyers. We’ve created a turnkey, end-to-end data monetization experience that will allow companies with interesting data to monetize it almost immediately.”

Masergy Enhances AIOps to Help Companies Improve Cloud Application Performance

Masergy, the software-defined network and cloud platform for the digital enterprise, enhanced its Masergy AIOps feature by applying artificial intelligence (AI) and machine learning to optimize Software as a Service (SaaS) applications on global networks. The advancements help companies of all sizes to more quickly and easily solve the problems of application management while also automating IT processes and preventing performance degradation.

Businesses need enterprise cloud applications to be readily available to their employees no matter where they are, and yet it remains difficult for IT teams to ensure a high-performance user experience. Complex IT infrastructures and multi-cloud environments obscure visibility, requiring AI analytics to effectively identify and solve the root causes of performance degradation.

“This is the next innovation and another step forward on Masergy’s path to offering a fully autonomous network,” said Chris MacFarland, CEO, Masergy. “While our clients benefit from the automated analysis and intelligent recommendations of AIOps, Masergy is delivering on the future faster than our competitors. Our AI-powered cloud networking platform is pushing the boundaries of what’s possible.”
