Kapow Software Launches Kapow Katalyst 9.0 with New Ability to Automate Custom Tasks and Rapidly Integrate Any Application

Kapow Software, which enables rapid integration of any application or data source without coding or APIs, today announced the release of its new Kapow Katalyst Application Integration Platform 9.0. It is the first software platform of its kind to feature Integration-as-a-Self-Service through the introduction of lightweight end-user apps, Kapow Kapplets, that empower employees and IT leaders to collaborate in new ways to help modernize the workplace, increase productivity and improve business results. Other additions include Hadoop connectivity, Cloudera certification and a new partnership with Informatica.

Kapow Kapplets put the power of big data more directly into the hands of business users for the first time. Employees simply describe what they need to their IT department, and the Kapow Katalyst 9.0 platform makes it fast and easy to integrate the required data and applications. Kapow Kapplets are delivered to workers as user-friendly app icons that can be clicked to run and manage the automated workflow, extending the company’s self-service platform to line-of-business employees. This new capability enables employees, customers and partners to run, access and control the automation and integration of disparate systems and data sources.
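To make the idea concrete, here is a minimal sketch, in Python, of how a self-service front end might hand a workflow request off to an automation server over REST. The endpoint, payload fields and the `run_kapplet` helper are illustrative assumptions for this article, not Kapow Software's documented API.

```python
import requests

# Hypothetical automation-server endpoint; Kapow's actual Kapplet/RoboServer
# interface may differ -- this is an illustrative assumption, not documented behavior.
AUTOMATION_SERVER = "https://automation.example.com/api/workflows"

def run_kapplet(workflow_name, parameters, api_token):
    """Trigger a named automation workflow and return its result payload."""
    response = requests.post(
        f"{AUTOMATION_SERVER}/{workflow_name}/run",
        json={"parameters": parameters},
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # A business user clicking a Kapplet icon would, in effect, issue a call like this.
    result = run_kapplet(
        "competitor-price-check",
        {"region": "EMEA", "currency": "EUR"},
        api_token="REPLACE_ME",
    )
    print(result)
```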

Using Kapow Katalyst 9.0, business users can access and harness big data, social media, cloud and mobile content much more simply, while keeping corporate assets and business processes safe and intact. Importantly, corporate IT leaders can integrate applications and create custom automations without costly, time-consuming coding or dependency on APIs. The platform addresses companies' need to roll out the new enterprise apps that workers demand, and helps connect people, processes and data across a variety of software applications, platforms and devices. With Kapow Software’s approach, enterprise app rollout takes hours or days instead of weeks or months.

“There’s been a convergence of mobile, social, cloud and big data. It’s all moving very quickly. Business and IT leaders are now re-inventing how work gets done. The sheer amount of data and its velocity and variety can be overwhelming for employees and IT departments to manage,” said John Yapaola, CEO, Kapow Software. “And yet, there’s a big, strategic opportunity to rapidly integrate information and automate tasks using a veritable treasure trove of data, analytics and insights for business advantage and profit. One big bottleneck has been the cost, time and complexity of integrating with applications that don’t have APIs and the corresponding need to hire big consulting firms.”

With Kapow Katalyst 9.0, business users can collect and integrate all types of data - internal and external, structured and unstructured - from a limitless array of sources. By integrating data and applications for real-time intelligence while keeping IT policies and business processes in place, it helps solve a complex and growing IT dilemma facing most corporations today: nearly 80 percent of the data generated each year is unstructured (IBM [1]), and approximately 80 percent of enterprises using Hadoop are challenged in staffing and training (Ventana Research [2]).

Kapow Katalyst 9.0 provides a broad array of data-storage options and processing flexibility for big data projects. It comes enhanced with a custom-built Hadoop connector, an Informatica PowerExchange for Kapow Katalyst Adaptor and partner certification for Cloudera’s Distribution Including Apache Hadoop (CDH4). The software platform includes a dedicated tool for automating the interface that extracts information and loads it into big data frameworks.
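As a rough illustration of the extract-and-load step such a connector automates, the sketch below writes extracted records to a local newline-delimited JSON file and copies it into HDFS with the standard `hdfs dfs -put` command. The file paths and record source are placeholder assumptions; the actual Kapow/Cloudera integration is configured inside the platform rather than scripted by hand.

```python
import json
import subprocess
from pathlib import Path

def load_records_into_hdfs(records, local_path, hdfs_dir):
    """Stage extracted records locally, then copy them into HDFS.

    `records` is any iterable of dicts; paths are illustrative placeholders.
    """
    staging = Path(local_path)
    with staging.open("w", encoding="utf-8") as fh:
        for record in records:
            fh.write(json.dumps(record) + "\n")  # newline-delimited JSON

    # Relies on the standard Hadoop CLI being available on PATH.
    subprocess.run(
        ["hdfs", "dfs", "-put", "-f", str(staging), hdfs_dir],
        check=True,
    )

if __name__ == "__main__":
    sample = [{"source": "web", "ticker": "EXMPL", "price": 41.2}]
    load_records_into_hdfs(sample, "/tmp/extract.jsonl", "/data/raw/")
```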

The introduction of the Informatica PowerExchange for Kapow Katalyst Adaptor enables quick access to, and extraction of, disparate data sources outside the firewall. It extends Kapow Software’s big data strategy of making it simple to integrate, enrich and deliver unstructured content from any Web-based data source, specifically those websites and cloud applications that are difficult to access.
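For contrast, the kind of hand-rolled extraction script this approach is positioned to replace typically looks like the Python sketch below, which pulls the first HTML table from a public web page using `requests` and `BeautifulSoup`. The URL and table structure are hypothetical; real sites that are "difficult to access" add logins, JavaScript rendering and pagination on top of this.

```python
import requests
from bs4 import BeautifulSoup

def scrape_table(url):
    """Fetch a page and return the first HTML table as a list of row dicts."""
    page = requests.get(url, timeout=30)
    page.raise_for_status()
    soup = BeautifulSoup(page.text, "html.parser")

    table = soup.find("table")
    if table is None:
        return []

    headers = [th.get_text(strip=True) for th in table.find_all("th")]
    rows = []
    for tr in table.find_all("tr")[1:]:
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if cells:
            rows.append(dict(zip(headers, cells)))
    return rows

if __name__ == "__main__":
    # Placeholder URL -- substitute any page with a simple HTML table.
    print(scrape_table("https://example.com/market-data"))
```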

Enterprise technologies have no proving ground more rigorous than financial services, where Kapow Kapplets offer compelling advantages. “Consider investment-bank traders and analysts,” said Rick Kawamura, VP of Marketing at Kapow Software. “Kapow Kapplets give them one-click, real-time access to research insights from the goldmine of big data. Freed from dependence on pricey data feed vendors or consultants to code custom integrations through a lengthy process, they can be up and running in a matter of hours with automated queries designed to harvest critical market data in real-time.”
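A stripped-down version of such an automated query is sketched below: a polling loop that re-runs a data pull on a fixed interval and flags results that cross a threshold. The feed URL, interval and threshold are placeholders; in the scenario Kawamura describes, the pull itself would be a Kapplet rather than a raw HTTP call.

```python
import time
import requests

FEED_URL = "https://example.com/api/quotes/EXMPL"  # placeholder data feed
ALERT_PRICE = 100.0
POLL_SECONDS = 60

def poll_quotes():
    """Re-run the data pull every POLL_SECONDS and flag threshold crossings."""
    while True:
        quote = requests.get(FEED_URL, timeout=10).json()
        price = float(quote.get("last", 0.0))
        if price >= ALERT_PRICE:
            print(f"ALERT: last price {price} crossed {ALERT_PRICE}")
        else:
            print(f"last price {price}")
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    poll_quotes()
```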

ABOUT KAPOW SOFTWARE

Kapow Software is a leading innovator in the integration market with its intuitive, powerful application integration platform that harnesses the power of cloud, mobile, social and big data without the need for consultants, coding or APIs. Its Integration-as-a-Self-Service™ platform is the first of its kind to empower employees and IT leaders to collaborate in new ways to help modernize the workplace, increase agility and improve business results. Kapow Software is trusted by more than 600 large global enterprises, including Audi, Intel, Fiserv, Deutsche Telekom and more than a dozen federal agencies for process automation, application integration, big data collection, mobile enablement, content migration, web data intelligence, social media monitoring and other mission-critical solutions. For more information, please visit: www.kapowsoftware.com.

###

[1] Source: IBM: 80 percent of our global data is unstructured (so what do we do?), ComputerWeekly.com, October 26, 2010, http://www.computerweekly.com/blogs/cwdn/2010/10/ibm-80-percent-of-data-is-unstructured-so-what-do-we-do.html

[2] Source: Hadoop and Information Management: Benchmarking the Challenges of Enormous Volumes of Data, Ventana Research, 2012

