LucidWorks Big Data Now Available

Successful Beta Program Fuels Industry's First Integrated Application Development Platform for Big Data Killer Apps

REDWOOD CITY, Calif., Nov. 15, 2012 /PRNewswire/ -- Many organizations developing Big Data applications focus first on controlling, managing and securing the burgeoning volumes of data being generated. That step, however, only scratches the surface of the data's ability to power competitive advantage. Only when these multi-structured data stores become accessible to search and discovery is the information transformed into corporate assets that yield business insights and deliver real value to the organization. To develop those insights, robust search capability must be built into Big Data applications from the beginning, not bolted on as an afterthought.

LucidWorks, the trusted name in Search, Discovery and Analytics, today announced the general availability of LucidWorks Big Data™, an application development platform that integrates search capabilities into the foundational layer of Big Data implementations. Built on key Apache open source projects, LucidWorks Big Data enables organizations to quickly uncover, access and evaluate large volumes of previously dark data and make better-informed business decisions. Using LucidWorks Big Data, organizations have attained insights that were previously locked away in their vast data stores, honing their competitive edge and slashing the time it takes to meet their goals.

LucidWorks Big Data Makes an Impact

The release of LucidWorks Big Data follows a comprehensive and highly collaborative beta program through which the product's integrations, scalability, usability and APIs were rigorously tested.

"Computing for Disasters is an initiative that has the potential to revolutionize our nation's preparedness and resilience in the face of future disasters by adopting a computational perspective to fundamental scientific, engineering, and social barriers to disaster management and related research.  To collect, manage, and analyze a diverse range of data sources takes a comprehensive big data architecture that offers a powerful search engine at its core.  We made the decision to take advantage of the LucidWorks Big Data platform because it offers key capabilities in a tightly integrated solution."
 - Dr. Edward Fox, Professor, Virginia Tech Department of Computer Science

"Bright Planet is a pioneer in Deep Web Intelligence, offering our customers the ability to perform deep harvesting of public information that lies beneath the surface of the web.  Our customers span across both public and private sectors.  We made the decision to work with the LucidWorks Big Data platform because of its ability to seamlessly and quickly gather large amounts of information from a variety of different sources (in addition to the web) – and then offer it to our patented deep harvesting search technology. We believe that the combined capability will offer valued solutions to organizations across both private and public sectors."
 - Steve Pederson, CEO and Chairman, BrightPlanet Corporation

"At OpenSource Connections, we spend a lot of time building infrastructure around Solr in order to continually enhance enterprise search capabilities. LucidWorks has all of that right out-of-the-box. One of the killer features for me is the ability to use any of the LucidWorks Search connectors to ingest data into a cluster, perform analytics on it, and use those results to improve search and data discovery. I've spent months building that sort of thing from scratch, and in LucidWorks Big Data I can do it in about five service calls."
 - Scott Stults, Founder and Solutions Architect, OpenSource Connections
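
The loop Stults describes (ingest through a connector, run analytics, feed the results back into search) can be pictured as a handful of REST calls. The Python sketch below is a minimal illustration of that flow; the API root, endpoint paths and JSON fields are assumptions made for the example, not documented LucidWorks Big Data endpoints.

```python
# Hypothetical sketch of the ingest -> analytics -> search loop described above.
# The base URL, endpoint paths and JSON fields are illustrative assumptions only.
import requests

BASE = "http://bigdata.example.com/sda/v1"  # assumed API root

# 1. Create a collection to hold the harvested documents.
requests.post(f"{BASE}/collections", json={"name": "web_harvest"}).raise_for_status()

# 2. Kick off a connector-based ingest job that crawls content into the collection.
job = requests.post(
    f"{BASE}/collections/web_harvest/ingest",
    json={"connector": "web", "startUrls": ["http://example.org"]},
).json()

# 3. Run a batch analytics workflow (e.g. document clustering) over the ingested data.
analysis = requests.post(
    f"{BASE}/collections/web_harvest/workflows",
    json={"type": "clustering", "input": job.get("id")},
).json()

# 4. Search the collection, boosting relevance with the analytics output.
hits = requests.get(
    f"{BASE}/collections/web_harvest/search",
    params={"q": "disaster response", "boostModel": analysis.get("id")},
).json()

# 5. Inspect the top results.
for doc in hits.get("docs", []):
    print(doc.get("title"))
```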

With the general availability of LucidWorks Big Data, organizations can now utilize a single platform for their Big Data search, discovery and analytics needs. Designed to be ready out-of-the-box, LucidWorks Big Data is the industry's only solution that combines the power of multiple Apache open source projects, including Hadoop, Mahout, Hive and Lucene/Solr, to provide search, machine learning, recommendation engines and analytics for structured and unstructured content in one complete solution available in the cloud, on premise or as a hybrid solution.

The LucidWorks Big Data platform includes all of the necessary open source components, pre-integrated and certified. LucidWorks equips technologists and business users with the ability to pilot Big Data projects on premise or in the cloud. This means organizations can avoid the staggering overhead costs and long lead times associated with infrastructure and application development lifecycles while assessing product fit.

LucidWorks Big Data is the only complete development platform that includes:

  • A unified platform for developing Big Data applications
  • A certified and tightly integrated open source stack: Hadoop, Lucene/Solr, Mahout, NLP, Hive
  • A single, uniform REST API (illustrated in the sketch after this list)
  • Out-of-the-box provisioning – cloud or on premise
  • Software pre-tuned by open source industry experts
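
As a rough illustration of the single, uniform REST API point above, the sketch below shows one client calling one API root for both a Solr-backed full-text search and a Hive-backed SQL query. The paths and payloads are hypothetical, since the announcement does not specify the actual API surface.

```python
# One client, one assumed API root, two back-end engines behind it
# (Lucene/Solr for search, Hive for SQL-style analytics). Illustrative only.
import requests

BASE = "http://bigdata.example.com/sda/v1"  # assumed API root, as in the earlier sketch

# Full-text search, served by Lucene/Solr behind the uniform API.
search = requests.get(
    f"{BASE}/collections/web_harvest/search",
    params={"q": "outage AND (power OR grid)", "rows": 10},
).json()

# SQL-style batch analytics, served by Hive over the same data, through the same API root.
report = requests.post(
    f"{BASE}/collections/web_harvest/sql",
    json={"statement": "SELECT source, COUNT(*) AS n FROM docs GROUP BY source"},
).json()

print(len(search.get("docs", [])), "search hits;", len(report.get("rows", [])), "report rows")
```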

"Working closely with our beta customers, we've witnessed the significant business value that they've achieved through their LucidWorks Big Data projects," said Paul Doscher, president and CEO of LucidWorks. "LucidWorks Big Data helps companies leap forward by uncovering trends and insights they never would have been able to leverage previously. Whether it's growing revenue, expanding into new markets or increasing customer satisfaction, LucidWorks Big Data helps companies achieve their business goals by extracting, analyzing and quickly acting on critical operational information from their ever-compounding collection of data."

LucidWorks Big Data will be available for download by mid-December. To sign up for notification, visit http://www.lucidworks.com/products/lucidworks-big-data. To learn more about LucidWorks Big Data, please visit www.lucidworks.com, email [email protected] or call (650) 353-4057.

About LucidWorks (Formerly Lucid Imagination)

LucidWorks is the only company that delivers enterprise-grade search development platforms built on the power of Apache Lucene/Solr open source search. Out of the 37 Core Committers to the Apache Lucene/Solr project, eight individuals work for LucidWorks, making the company the largest supporter of open source search in the industry. Customers include AT&T, Sears, Ford, Verizon, Cisco, Zappos, Raytheon, The Guardian, The Smithsonian Institution, Salesforce.com, The Motley Fool, Qualcomm, Taser, eHarmony and many other household names around the world. LucidWorks' investors include Shasta Ventures, Granite Ventures, Walden International and In-Q-Tel. Learn more about the company at www.lucidworks.com.

SOURCE LucidWorks
