LucidWorks Big Data Now Available

Successful Beta Program Fuels Industry's First Integrated Application Development Platform for Big Data Killer Apps

REDWOOD CITY, Calif., Nov. 15, 2012 /PRNewswire/ -- Many organizations developing Big Data applications first focus on trying to control, manage and secure the burgeoning volumes of data being generated. However, that step only scratches the surface of the data's ability to power competitive advantage. It is only when these multi-structured data stores become accessible to search and discovery that the information is transformed into corporate assets from which business insights can be derived and true value delivered to the organization. To develop these insights, robust search capability must be built into Big Data applications from the beginning, not added on as an afterthought.

LucidWorks, the trusted name in Search, Discovery and Analytics, today announced the general availability of LucidWorks Big Data™, an application development platform that integrates search capabilities into the foundational layer of Big Data implementations. Built on a foundation of key Apache open source projects, LucidWorks Big Data enables organizations to quickly uncover, access and evaluate large volumes of previously dark data in order to make better-informed business decisions. Using LucidWorks Big Data, organizations have been able to attain insights that were previously locked away in their vast data stores, honing their competitive edge and slashing the time it takes to meet their goals.

LucidWorks Big Data Makes an Impact

The release of LucidWorks Big Data follows a comprehensive and highly collaborative beta program through which the product's integrations, scalability, usability and APIs were rigorously tested.

"Computing for Disasters is an initiative that has the potential to revolutionize our nation's preparedness and resilience in the face of future disasters by adopting a computational perspective to fundamental scientific, engineering, and social barriers to disaster management and related research.  To collect, manage, and analyze a diverse range of data sources takes a comprehensive big data architecture that offers a powerful search engine at its core.  We made the decision to take advantage of the LucidWorks Big Data platform because it offers key capabilities in a tightly integrated solution."
 - Dr. Edward Fox, Professor, Virginia Tech Department of Computer Science

"BrightPlanet is a pioneer in Deep Web Intelligence, offering our customers the ability to perform deep harvesting of public information that lies beneath the surface of the web. Our customers span both the public and private sectors. We made the decision to work with the LucidWorks Big Data platform because of its ability to seamlessly and quickly gather large amounts of information from a variety of different sources (in addition to the web) – and then offer it to our patented deep harvesting search technology. We believe that the combined capability will offer valued solutions to organizations across both sectors."
 - Steve Pederson, CEO and Chairman, BrightPlanet Corporation

"At OpenSource Connections, we spend a lot of time building infrastructure around Solr in order to continually enhance enterprise search capabilities. LucidWorks has all of that right out-of-the-box. One of the killer features for me is the ability to use any of the LucidWorks Search connectors to ingest data into a cluster, perform analytics on it, and use those results to improve search and data discovery. I've spent months building that sort of thing from scratch, and in LucidWorks Big Data I can do it in about five service calls."
 - Scott Stults, Founder and Solutions Architect, OpenSource Connections

With the general availability of LucidWorks Big Data, organizations can now utilize a single platform for their Big Data search, discovery and analytics needs. Designed to be ready out-of-the-box, LucidWorks Big Data is the industry's only solution that combines the power of multiple Apache open source projects, including Hadoop, Mahout, Hive and Lucene/Solr, to provide search, machine learning, recommendation engines and analytics for structured and unstructured content in one complete solution available in the cloud, on-premises or as a hybrid deployment.

The LucidWorks Big Data platform includes all of the necessary open source components, pre-integrated and certified, as indicated in this diagram. LucidWorks equips technologists and business users with the ability to initially pilot Big Data projects on-premises or in the cloud. This means that organizations can avoid the staggering overhead costs and long lead times associated with infrastructure and application development lifecycles while assessing product fit.

LucidWorks Big Data is the only complete development platform that includes:

  • A unified development platform for developing Big Data applications
  • A certified and tightly integrated open source stack: Hadoop, Lucene/Solr, Mahout, NLP, Hive
  • Single uniform REST API
  • Out-of-the-box provisioning – cloud or on-premises
  • Pre-tuned software by open source industry experts
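
To illustrate the workflow Scott Stults describes – ingest via a connector, run analytics, then feed the results back into search – here is a minimal sketch of what such a sequence of REST service calls might look like. Every endpoint path, collection name and payload below is hypothetical; this release does not document the actual LucidWorks Big Data API, so the code only models the shape of the five-call round trip.

```python
import json

def plan_workflow(collection):
    """Return a hypothetical sequence of (method, path, payload) REST calls
    for an ingest -> analyze -> enrich-search round trip."""
    base = f"/sda/v1/collections/{collection}"
    return [
        # 1. Create the collection (path and payload are illustrative).
        ("POST", "/sda/v1/collections", {"name": collection}),
        # 2. Ingest documents through a connector.
        ("POST", f"{base}/dataretrieval",
         {"connector": "web", "url": "http://example.com/docs"}),
        # 3. Kick off an analytics job over the ingested content.
        ("POST", f"{base}/analysis", {"job": "clustering"}),
        # 4. Fetch the analytics results.
        ("GET", f"{base}/analysis/results", None),
        # 5. Apply the results to tune search relevance.
        ("PUT", f"{base}/search/config", {"boosts": "from-analysis"}),
    ]

if __name__ == "__main__":
    for method, path, payload in plan_workflow("demo"):
        print(method, path, "" if payload is None else json.dumps(payload))
```

The point of the sketch is the shape, not the endpoints: a single uniform REST API lets the ingest, analytics and search-tuning steps be chained without custom glue code, which is what the "about five service calls" claim refers to.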

"Working closely with our beta customers, we've witnessed the significant business value that they've achieved through their LucidWorks Big Data projects," said Paul Doscher, president and CEO of LucidWorks. "LucidWorks Big Data helps companies leap forward by uncovering trends and insights they never would have been able to leverage previously. Whether it's growing revenue, expanding into new markets or increasing customer satisfaction, LucidWorks Big Data helps companies achieve their business goals by extracting, analyzing and quickly acting on critical operational information from their ever-compounding collection of data."

LucidWorks Big Data will be available for download by mid-December.  To sign up for notification, visit http://www.lucidworks.com/products/lucidworks-big-data.  To learn more about LucidWorks Big Data, please visit www.lucidworks.com, email info@lucidworks.com or call (650) 353-4057.

About LucidWorks (Formerly Lucid Imagination)

LucidWorks is the only company that delivers enterprise-grade search development platforms built on the power of Apache Lucene/Solr open source search. Out of the 37 Core Committers to the Apache Lucene/Solr project, eight individuals work for LucidWorks, making the company the largest supporter of open source search in the industry. Customers include AT&T, Sears, Ford, Verizon, Cisco, Zappos, Raytheon, The Guardian, The Smithsonian Institution, Salesforce.com, The Motley Fool, Qualcomm, Taser, eHarmony and many other household names around the world. LucidWorks' investors include Shasta Ventures, Granite Ventures, Walden International and In-Q-Tel. Learn more about the company at www.lucidworks.com.

SOURCE LucidWorks

