Hadoop news

Top Stories

DALLAS, Aug. 21, 2014 /PRNewswire-iReach/ -- Amid the proliferation of real-time data from sources such as mobile devices, the web, social media, sensors, log files and transactional applications, Big Data has found a host of vertical market applications, ranging from fraud detection to R&D. Photo - http://photos.prnewswire.com/prnh/20140821/138541 "Big Data Market: 2014 – 2020 – Opportunities, Challenges, Strategies, Industry Verticals & Forecasts" Key Findings: In 2014 Big Data vendors will pocket nearly $30 Billion from hardware, software and professional services revenues; Big Data investments are further expected to grow at a CAGR of nearly 17% over the next six years, eventually accounting for $76 Billion by the end of 2020; the market is ripe for acquisitions of pure-play Big Data startups as competition heats up between IT incumbents. Nearly every large scale IT ven... (more)

Cloud Computing and Big Data in 2013: What's Coming Next?

What changes in the cloud computing and big data landscape should we expect in 2013? In this article we offer a round-up of industry experts' opinions, gathered by Cloud Expo / BigDataExpo Conference Chair Jeremy Geelan, previewing the fast-approaching year ahead. 2013 Will Be The Year of Big Data | The Internet of Things | Cloud To The Rescue (DR) | SSD John Engates | @jengates CTO of Rackspace Hosting Now the company's CTO, John joined Rackspace in August 2000, just a year after the company was founded, as VP of Operations, managing the datacenter operations and customer-service teams. Two years later, when Rackspace decided to add new services for larger enterprise customers, he created and helped develop the Intensive Hosting business unit. Most recently, he has played an active role in the evolution and evangelism of Rackspace's cloud computing strategy an... (more)

Examining the True Cost of Big Data

The good news about the Big Data market is that we generally agree on the definition of Big Data: data that has volume, velocity and variety, which businesses need to collect, store, manage and analyze in order to derive business value, otherwise known as the "4 V's." However, the problem with such a broad definition is that it can mean different things to different people once you start to put real values next to those V's. Let's be honest: volume means different things to different organizations. To some it is anything above 10 terabytes of managed data in their BI environment; to others it is petabyte scale and nothing less. Likewise, velocity can mean multiple billions of daily records coming into the enterprise from various external and internal networks. When it really comes down to it, each business situation will be qu... (more)

Big Data Top Ten | @CloudExpo [#BigData]

What do you get when you combine Big Data technologies like Pig and Hive? A flying pig? No, you get a “Logical Data Warehouse”. My general prediction is that Cloudera and Hortonworks are both aggressively moving to fulfill a vision that looks a lot like Gartner’s “Logical Data Warehouse”, namely “the next-generation data warehouse that improves agility, enables innovation and responds more efficiently to changing business requirements.” In 2012, Infochimps (now CSC) leveraged its early use of stream processing, NoSQL databases, and Hadoop to create a design pattern that combined real-time, ad hoc, and batch analytics. This combining of best-of-breed Big Data technologies will continue to advance across the industry until the entire legacy (and proprietary) data infrastructure stack is replaced with a new (and open) one. As this is happening, I predi... (more)

Cousins of Cobol in Big Data Analytics

In this article I would like to look at a few tools that are overlooked when it comes to Big Data analytics. Organizations that already have a heavy investment in mainframes, and would like to continue utilizing them, can consider these tools to further expand their Big Data analytics reach. DFSORT - Sorting & Merging Large Data Sets: Long before RDBMSs took their place, Cobol programs had two major file-manipulation operations: the SORT operation accepts un-sequenced input and produces output in a specified sequence, and the MERGE operation compares records from two or more files and combines them in order. DFSORT adds the ability to do faster and easier sorting, merging, copying, reporting and analysis of your business information, as well as versatile data handling at the record, fixed position/length or variable position/length fi... (more)
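The SORT and MERGE semantics described above can be sketched outside the mainframe as well. A minimal illustration in Python follows; the record layout (a fixed-position key at the start of each record) and the sample data are hypothetical, chosen only to mirror DFSORT-style fixed-position keys:

```python
import heapq

# SORT: accept un-sequenced input, produce output in a specified sequence,
# keyed on a fixed position/length field within each record.
def sort_records(records, key_start, key_len):
    return sorted(records, key=lambda r: r[key_start:key_start + key_len])

# MERGE: compare records from two or more already-sorted inputs
# and combine them in key order, without re-sorting everything.
def merge_records(sorted_inputs, key_start, key_len):
    return list(heapq.merge(*sorted_inputs,
                            key=lambda r: r[key_start:key_start + key_len]))

file_a = sort_records(["0003 ACME", "0001 GLOBEX"], 0, 4)
file_b = sort_records(["0002 INITECH"], 0, 4)
merged = merge_records([file_a, file_b], 0, 4)
# merged is ordered by the 4-byte key: 0001, 0002, 0003
```

The merge step streams the inputs rather than concatenating and re-sorting them, which is the same reason MERGE exists as a distinct operation on the mainframe.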

The Big Data Revolution

For many years, companies collected data from various sources that often found its way into relational databases like Oracle and MySQL. However, the rise of the Internet, Web 2.0, and more recently social media brought an enormous increase in both the amount and the variety of data created. No longer was data relegated to types that fit easily into standard data fields; it now came in the form of photos, geographic information, chats, Twitter feeds, and emails. The age of Big Data is upon us. Big Data Beginnings A study by IDC titled "The Digital Universe Decade" projects a 45-fold increase in annual data by 2020. In 2010, the amount of digital information was 1.2 zettabytes (1 zettabyte equals 1 trillion gigabytes). To put that in perspective, 1.2 zettabytes is the equivalent of a full-length episode of "24" running continuously for 125 million years, ac... (more)

Pentaho #1 in Support for Big Data Platforms

SAN DIEGO, CA -- (Marketwire) -- 08/08/11 -- TDWI -- Pentaho, a leading worldwide provider of business intelligence (BI) and data integration software, today reaffirmed its commitment to support Big Data with a major expansion of native Big Data sources, including the latest Hadoop distributions and NoSQL sources, as well as native support for many analytic databases and traditional OLTP databases. Pentaho's native connection to Big Data platforms makes it easier and faster than ever to analyze the enormous data volumes generated by today's organizations. Attend the live webinar: Pentaho Business Intelligence for Hadoop, NoSQL, Analytical Databases and OLTP Databases Pentaho is changing the way that organizations analyze big data. Pentaho's Big Data support enables: Fastest time to solution for Big Data analysis and data integration problems; Optimized performance and... (more)

How Enterprise Big Data Will Affect Organizations in 2012

As the new year begins, global companies face the coming year's most prominent IT and business challenge: Big Data. The focus for IT will be to provide high performance analytics capabilities at the lowest cost, as business users need to tap into volumes of multi-structured data about their customers and markets to gain competitive advantage. RainStor, a provider of Big Data management software, has released five predictions focused on how enterprise Big Data will affect organizations in 2012. Based on client and partner experience, market research and conversations with industry experts, here are RainStor's five predictions for Big Data in 2012: Prediction #1: Big Data will Transition from Technology "Buzz" to a Real Business Challenge Affecting Many Large Global Enterprises Big Data is largely centered on leveraging the open source Apache Hadoop analytics platform... (more)

Big Data Analytics: Thinking Outside of Hadoop

Big Data Predictions In its recent '2012 Hype Cycle of Emerging Technologies,' research firm Gartner evaluated several technologies to come up with a list of those that will dominate the future. "Big Data"-related technologies form a significant portion of the list; in particular, the following technologies revolve around the concept and usage of Big Data. Social Analytics: These analytics allow marketers to identify sentiment and spot trends in order to accommodate the customer better. Activity Streams: Activity Streams are the future of enterprise collaboration, uniting people, data, and applications in real time in a central, accessible, virtual interface. Think of a company social network where every employee, system, and business process exchanges up-to-the-minute information about their activities and outcomes. Natural Language Question A... (more)

Introducing Big Data

The phrase “Big Data” is thrown around a lot these days. What exactly does it refer to? When I was part of IBM’s DB2 development team, the maximum size of a DB2 table was 64 gigabytes (GB), and I wondered who on earth could use a database that size. Thirty years later, that number looks tiny; now you can buy a 1-terabyte external drive for less than $100. Let us start with a level set on units of storage. In multiples of 1000, we go from Byte – Kilobyte (KB) – Megabyte (MB) – Gigabyte (GB) – Terabyte (TB) – Petabyte (PB) – Exabyte (EB) – Zettabyte (ZB) – Yottabyte (YB). The last one, YB, is 10 to the power of 24 bytes. A typed page is 2KB. The entire book collection at the US Library of Congress is 15TB. The amount of data processed in one hour at Google is 1PB. The total amount of information in existence is around 1.27ZB. Now you get some context... (more)
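The ladder of decimal units above is easy to mechanize. A small Python sketch that expresses a byte count in the largest sensible unit, using the same multiples-of-1000 convention and the example figures quoted in the passage:

```python
# Storage units in multiples of 1000, from Byte up to Yottabyte.
UNITS = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def human_readable(num_bytes):
    """Express a byte count in the largest decimal unit that keeps the value >= 1."""
    value, idx = float(num_bytes), 0
    while value >= 1000 and idx < len(UNITS) - 1:
        value /= 1000
        idx += 1
    return f"{value:g} {UNITS[idx]}"

print(human_readable(2_000))          # a typed page: 2 KB
print(human_readable(15 * 10**12))    # Library of Congress book collection: 15 TB
print(human_readable(1.27 * 10**21))  # all information in existence: 1.27 ZB
```

Note these are decimal (powers-of-1000) units, matching the article's convention; binary units (KiB, MiB, ...) advance by 1024 instead.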

Five Big Data Features in SQL Server

Traditional RDBMS & New Data Processing Over the past two decades, relational databases have been most successful in serving large-scale OLTP and OLAP applications across enterprises. However, in the past couple of years, the advent of Big Data processing, especially the need to process massive quantities of unstructured data, has made the industry look into non-RDBMS solutions. This has led to the popularity of NoSQL databases as well as massively parallel processing frameworks. However, traditional RDBMS vendors were quick to react and added several Big Data features to their offerings, so that enterprises with a heavy investment in a traditional RDBMS can have the best of both worlds by properly leveraging these new features. The following sections provide ideas about Big Data features in the popular SQL Server databa... (more)