
BitYota Start-Up Launches Data-Warehouse-as-a-Service

BitYota was one of eight start-ups hand-selected by Amazon to be showcased

Another one of Andreessen Horowitz's seemingly endless supply of gold-tinged start-ups, whose future is supposed to be theirs to lose, has come out of start-up hiding to proclaim its newfangled SaaS-style data-warehouse-as-a-service for Big Data analytics, running initially on Amazon.

It's called BitYota - a name combining the bit, the smallest unit of computer storage, with yotta, the largest SI prefix, or 10^24. Besides Andreessen Horowitz, BitYota's $12 million in seed and Series A funding comes from Globespan Capital, the Social+Capital Partnership, Dawn Capital, Crosslink Capital, Morado Ventures and individual investors like Yahoo founder Jerry Yang; Yahoo board member, former eBay COO and angel investor Maynard Webb; investor and financial writer Graham Summers; and Sharmila Mulligan, Marc Andreessen buddy and founder of the BitYota-like Big Data start-up ClearStory Data, who's evidently spreading her bets on Big Data analytics.

BitYota was one of eight start-ups hand-selected by Amazon to be showcased at re:Invent, Amazon Web Services' (AWS) inaugural customer and partner event in Las Vegas this week, where AWS happened to announce Redshift, its own strategic cloud data warehouse service, currently in closed beta.

It's unclear whether that surprised BitYota, which has garnered six beta customers in the last few months, including an ad-tech firm, an ed-tech firm and a mobile shop. All it can really do is characterize Amazon's move as validation.

The start-up says its widgetry is designed to run on cloud infrastructure without compromising functionality or scalability. It plans to expand its service to other public clouds besides Amazon in the near future, and with Redshift in the offing, it'll presumably have to.

BitYota dramatically claims its SaaS platform frees Big Data "from the shackles of Big Costs and Big Headaches" and makes its analytics solution accessible in the fastest, most affordable way to the most users, with no compromise on functionality or service levels.

Its attributes include one-click data integration, fast analytics at scale and accessibility using familiar tools.

It was built from the ground up around its own shared-nothing, massively parallel relational database. It's supposed to unite the feature-rich functionality of a full-scale data warehouse with the flexibility and cost-effectiveness of Amazon's cloud infrastructure so any company can unleash the value of its data to gain insights and make better business decisions.
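For the curious, the shared-nothing idea behind MPP warehouses is simple: every row lives on exactly one node, each node aggregates only its own slice of the data, and a coordinator merges the partial results. Here's a toy sketch in Python - not BitYota's actual internals, just the general technique, with made-up names:

```python
# Toy sketch of shared-nothing, massively parallel aggregation.
# Illustrative only -- not BitYota's implementation.
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

N_SHARDS = 4

def shard_for(key: str) -> int:
    """Each row lives on exactly one shard (shared-nothing)."""
    return hash(key) % N_SHARDS

def partial_count(rows):
    """A shard aggregates only its local rows."""
    counts = defaultdict(int)
    for user, _event in rows:
        counts[user] += 1
    return counts

def parallel_count(all_rows):
    # Partition rows across shards, aggregate in parallel, merge partials.
    shards = [[] for _ in range(N_SHARDS)]
    for row in all_rows:
        shards[shard_for(row[0])].append(row)
    merged = defaultdict(int)
    with ThreadPoolExecutor(N_SHARDS) as pool:
        for partial in pool.map(partial_count, shards):
            for user, n in partial.items():
                merged[user] += n
    return dict(merged)
```

Because no shard ever touches another shard's data, adding nodes scales the aggregation roughly linearly - which is the whole pitch of the architecture.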

BitYota figures it's got a fundamentally improved approach for those looking to harness the power of Big Data analytics. Key features include:

  • Affordable SaaS: Customers no longer have headaches about hardware provisioning or maintaining homogeneous configurations; software versions, upgrades and installations; database administrators to manage and maintain workload throughput; or advance planning for growth, burst needs and concurrent users.
  • One-Click Data Integration: Customers can load data from data sources such as S3, NoSQL stores and RDBMSs and combine all their data, including event logs, user profiles and transactions, in a single place in the cloud. BitYota auto-detects varying data formats and schemas, understands the rate of arrival/change of new data and seamlessly loads it on a pre-determined schedule.
  • Fast Analytics at Scale using Familiar Tools: BitYota's patent-pending massively parallel analytics engine enables analysis over fresh, fully detailed data. It works in the existing ecosystem of programming languages and BI tools; supports SQL92 natively; and integrates via an industry-standard ODBC API with popular business intelligence tools and dashboards to visualize results. Access is via industry-standard languages like JavaScript, Perl and Python, and via tools that analyze data without requiring any new or intermediary language skills.
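The SQL92 claim is the interesting one: analysts query event data with plain SQL rather than MapReduce jobs or a proprietary language. A minimal sketch of what that looks like, using Python's built-in SQLite as a stand-in for the warehouse (a real client would connect over ODBC with a vendor-supplied driver; the table and column names here are invented for illustration):

```python
# Plain SQL92-style analytics over event data. SQLite stands in for the
# warehouse here; a real BitYota session would go through an ODBC driver.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id TEXT, event TEXT, revenue REAL);
    INSERT INTO events VALUES
        ('u1', 'click', 0.0),
        ('u1', 'purchase', 9.99),
        ('u2', 'purchase', 4.50);
""")

# A standard GROUP BY aggregate -- no new query language to learn.
rows = conn.execute("""
    SELECT user_id, COUNT(*) AS n_events, SUM(revenue) AS total
    FROM events
    GROUP BY user_id
    ORDER BY total DESC
""").fetchall()
```

The point of the ODBC angle is that any BI dashboard speaking ODBC can issue exactly this kind of query without retooling.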

"Companies across every spectrum have an undeniable need to use data to unlock new sources of economic value but relatively few companies have invested sufficient time, money or people to do this right," BitYota CEO Dev Patel, who worked on Hadoop at Yahoo, said. "We believe that data and analytics should be broadly accessible to everyone inside the company and it shouldn't take a fortune to analyze data. I'm proud to say that today BitYota launched a data warehouse-as-a-service that allows just that. We are a cost-effective, self-managing and intuitive service that empowers users to work with their own data in familiar ways, with little retooling or disruption."

The starter widgetry costs $1,500 a month for five users, starting with 10 EC2 compute units and 500GB of storage. It's capable of five concurrent queries. There's a week-long free trial on a smaller configuration at www.bityota.com/pricing/.

The market for Big Data technology and services is supposed to grow from $3.2 billion in 2010 to $16.9 billion in 2015, a 40% CAGR. Within that forecast, software is expected to grow at 34.2%, servers at 27.3% and storage at 61.4%.
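The 40% figure checks out against the forecast's endpoints:

```python
# Compound annual growth rate from $3.2B (2010) to $16.9B (2015).
cagr = (16.9 / 3.2) ** (1 / 5) - 1  # about 0.395, i.e. roughly 40%
```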

BitYota's addressable target market is supposed to be a portion of the $5.8 billion of that spend operating on public clouds - roughly $3 billion.

It has visions of being to data warehousing what Salesforce has been to customer relationship management.

However, its immediate market would be companies with relatively small data warehouses, or folks already generating their data on Amazon's cloud - if Amazon doesn't gobble them all up.

The company, which so far has 15 people, was founded in 2011 by executives and senior engineers with Big Data experience from Yahoo, Oracle, Veritas/Symantec, Informix, BMC, Kabira/Tibco and Twitter.

One of its founders, chief of cloud services Soren Riise, was an honest-to-God rocket scientist at the European Space Agency. Another co-founder, CTO Harmeek Bedi, used to be a lead database architect at Oracle and at Informix, which IBM later bought.

More Stories By Maureen O'Gara

Maureen O'Gara, the most read technology reporter for the past 20 years, is the Cloud Computing and Virtualization News Desk editor of SYS-CON Media. She is the publisher of the famous "Billygrams" and was editor-in-chief of "Client/Server News" for more than a decade. One of the most respected technology reporters in the business, Maureen can be reached by email at maureen(at)sys-con.com or paperboy(at)g2news.com, and by phone at 516 759-7025. Twitter: @MaureenOGara


