Database Security

JSonar raises $50 million for AI-driven database security technologies.

Database security platform startup JSonar revealed today that it has raised $50 million, the majority of which it intends to use for research and development and for marketing. Notably, this is the company's first round of institutional funding, even though five of the ten largest banks in the world are already JSonar customers.

Monitoring database activity is critical to enterprise data security. Dynamic, cloud-based modern data systems make this harder, and traditional database activity monitoring and logging solutions often fail to convert that data into useful insights. JSonar asserts that its solutions perform better because they are automated and AI-powered: the underlying models convert petabytes of raw database activity data into security recommendations and can apply preventative controls in addition to detection. These controls and recommendations are incorporated into existing workflows and, via prebuilt connectors, into external DevOps services and platforms.

Users can create custom analytical algorithms with JSonar, which also automates reporting and governance procedures on top of massive databases; the company claims this saves businesses the months they would otherwise spend building proprietary platforms. The JSonar suite supports virtually any database system on any cloud, including infrastructure-as-a-service, platform-as-a-service, and database-as-a-service configurations, with pre-installed support for more than 60 platforms, such as Amazon Web Services, Google Cloud Platform (GCP), Microsoft Azure, Snowflake, MongoDB, Cassandra, Hadoop, and Teradata.
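JSonar has not published its models, but the general idea of turning raw database activity logs into security alerts can be illustrated with a minimal, hypothetical Python sketch: build a per-user, per-table baseline from historical query logs, then flag activity in a new window that either has no precedent or far exceeds its historical rate. All names, the log format, and the threshold here are assumptions for illustration, not JSonar's actual logic:

```python
from collections import Counter

def baseline_rates(events):
    """Count historical queries per (user, table) pair to form a simple baseline."""
    return Counter((e["user"], e["table"]) for e in events)

def flag_anomalies(baseline, window, threshold=3.0):
    """Flag (user, table) pairs in the new window that were never seen before,
    or whose count exceeds `threshold` times their historical count."""
    current = Counter((e["user"], e["table"]) for e in window)
    alerts = []
    for (user, table), count in current.items():
        seen = baseline.get((user, table), 0)
        if seen == 0 or count > threshold * seen:
            alerts.append({"user": user, "table": table, "count": count})
    return alerts
```

A real system would, of course, account for time-of-day patterns, query shape, and data volume, and would feed alerts into existing workflows rather than returning a list; the sketch only shows the baseline-versus-window comparison at the heart of activity monitoring.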

JSonar is far from the only startup in the database security market

According to Markets and Markets, the database security market is expected to be worth $7.01 billion by 2022, and JSonar is far from the only company pursuing it. Others include Netdata, which builds an open source database monitoring tool, and SolarWinds, which has long provided management and optimization tools for open systems and databases. But Ron Bennatan, JSonar's founder and CTO and previously a cofounder of Guardium (which IBM acquired in November 2009), believes his company's use of automation sets it apart from the competition. "The rapidly changing enterprise landscape, including cloud adoption, an explosion of database platforms, the pressing need for data security beyond just compliance, and years of frustration over runaway costs, has created a huge opportunity for us to rapidly expand," he told VentureBeat in an email. Traditional database security solutions have proven too expensive to be widely adopted and offer little more than a compliance checkbox, so the current data landscape requires a fresh approach.

Goldman Sachs led the investment in JSonar, and David Campbell, a managing director at Goldman Sachs, will join the startup's board of directors as part of the deal.

Net carbon neutrality is a goal for enterprises all across the world, but we have a long way to go before we reach it. Using less energy in the first place can bring us all closer, particularly in energy-hungry data centres dealing with escalating workloads in the era of machine learning and artificial intelligence. Data centres consume roughly one percent of the world's electricity. Although energy efficiency has helped limit that consumption despite the surge in demand for data services, the International Energy Agency notes a significant risk that growing demand for resource-intensive applications, AI in particular, will start to outpace the gains of recent years. In PwC's annual poll, only 22% of CEOs reported having made net-zero commitments, with another 29% working toward one; the largest corporations are the furthest along. No matter where your company falls on that spectrum, the cost of energy and its environmental impact are impossible to ignore when it comes to the use of IT resources. At Pure Storage, we prioritise giving clients the tools they need to use data to make better decisions. A vital part of that effort is helping them achieve a desired business outcome without spending money, or energy, on data centres full of the IT equipment used to make those wiser decisions.

The balance between effectiveness and efficiency

In light of this, consider the crucial role that energy efficiency plays in computing, from personal computers to data centres. Modern mobile CPUs combine performance cores and efficiency cores with dedicated neural processing units for machine learning applications. Thanks to these heterogeneous devices, smartphones and laptops can now deliver substantially longer battery life without sacrificing computational performance. As users of laptops and smartphones, we only see the results: fast performance with long battery life. But the key to that energy proportionality is the combination of specialised hardware capabilities and software that engages the right resources at precisely the right time based on workload.

Modern datacenter facilities have comparable capabilities for heterogeneous computation, combining conventional server central processing units (CPUs) with more workload-specific accelerators (graphics processing units and the like). Large-scale schedulers like Kubernetes can place workloads within a datacenter, guided by a handful of policies from the operations and development teams. Heterogeneous computing is essential to energy efficiency and proportionality, whether for hyperscale datacenters or battery-optimized mobile devices. And an operating system that abstracts the many hardware architectures beneath it is essential for building workable systems from heterogeneous computing equipment.
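The policy-driven placement described above can be sketched in a few lines of Python. This is not how Kubernetes' scheduler actually works (it is far more sophisticated, weighing many predicates and priorities); the node types, supported workload kinds, and wattage figures below are invented purely for illustration of energy-aware placement:

```python
# Hypothetical node types; a real scheduler would derive these from
# node labels and live telemetry rather than a hard-coded table.
NODES = {
    "cpu-general": {"supports": {"web", "batch"}, "watts_per_task": 50},
    "gpu-accel":   {"supports": {"ml", "batch"},  "watts_per_task": 200},
}

def place(workload_kind):
    """Pick the lowest-power node type that can run the given workload kind."""
    candidates = [
        (spec["watts_per_task"], name)
        for name, spec in NODES.items()
        if workload_kind in spec["supports"]
    ]
    if not candidates:
        raise ValueError(f"no node type supports {workload_kind!r}")
    return min(candidates)[1]  # min by watts; return the node type name
```

The point of the sketch is energy proportionality: a batch job that runs fine on general-purpose cores should not occupy a power-hungry accelerator, while machine learning work goes to the hardware built for it.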

An effective storage operating system

But what about datacenter storage? In the past, system architects would deploy entirely separate storage systems to meet different performance and capacity needs. As a result, they were left with a chaotic mix of infrastructure silos, each requiring specialised, frequently manual effort to manage workload placement and scheduling. That is wasteful and inefficient. With today's technology, ranging from quad-level cell flash to storage-class memory, we can create highly varied design points across the performance and capacity space within a single architectural family, as Pure has done. For instance, Meta considered power efficiency when it chose Pure for its brand-new AI Research SuperCluster (RSC), which Meta expects to be the world's fastest AI supercomputer.

For many organisations, rigid and complicated legacy infrastructure stands in the way of realising the promise of AI, a new era of intelligence. With a novel combination of hardware and software innovation that decreases power consumption and overall data centre footprint, Pure's newest family of solutions delivers a generational leap in both power efficiency and performance to help overcome those barriers. A modular architecture that separates storage compute resources from capacity lets storage be updated flexibly and without disruption, providing a platform that is highly scalable and customisable to efficiently target a variety of current workloads.

Imagining a storage management operating system at datacenter scale

The ability to disaggregate storage platforms offers greater flexibility for building highly efficient IT infrastructure. Administrators can not only assemble the ideal mix of IT resources for a specific task to reduce expenses, but also upgrade individual resources independently as needed. The new Pure platform uses a nearly infinitely scalable metadata architecture to expand over time with customer requirements, and it provides more than double the density, performance, and power efficiency of earlier generations. By performing better on key metrics like capacity per watt, bandwidth per watt, and capacity per rack unit, it also helps end users meet sustainability criteria and reduces the overall data centre footprint.

But where is the operating system for managing storage at datacenter scale? Today, workloads confined to inefficient infrastructure silos require manual data management to stay productive, which slows progress on sustainability, cost, and agility goals. Delivering storage as code resolves that conundrum. By embracing the agility and scalability of the cloud computing model, Pure Fusion, Pure's autonomous storage model, enables customers to achieve superior results through automation. Intelligent workload management continuously optimises storage pools by dynamically rebalancing workloads. At Pure, we take pride in advancing science-based innovation so that customers can create more intelligent and autonomous data centre policies that increase energy efficiency at scale. And that's good for the environment, for business, and for end consumers.
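Pure Fusion's rebalancing logic is proprietary, but the general idea of dynamically rebalancing storage pools can be sketched as a greedy loop that shifts load from the fullest pool to the emptiest until utilisations converge. The pool layout, unit of movement, and tolerance below are assumptions made for the example:

```python
def rebalance(pools, tolerance=0.1):
    """Greedily move one load unit at a time from the most utilised pool
    to the least utilised, until their utilisations are within `tolerance`.
    `pools` maps pool name -> (used_units, capacity_units)."""
    pools = {name: list(uc) for name, uc in pools.items()}  # make mutable
    moves = []
    for _ in range(1000):  # safety bound on iterations
        util = {name: used / cap for name, (used, cap) in pools.items()}
        fullest = max(util, key=util.get)
        emptiest = min(util, key=util.get)
        if util[fullest] - util[emptiest] <= tolerance:
            break
        pools[fullest][0] -= 1
        pools[emptiest][0] += 1
        moves.append((fullest, emptiest))
    return pools, moves
```

A production system would move actual volumes with migration costs, QoS constraints, and failure-domain awareness in mind; the sketch only captures the feedback loop of measuring utilisation and acting on the imbalance.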
