Making Apache Hadoop Less Retro: Bringing Standards to Big Data
Ten short years ago, Apache Hadoop was just a small project deployed on a few
machines at Yahoo; within a few years, it had become the backbone of Yahoo's
data infrastructure. Today, the Apache Hadoop market is forecast to surpass
$16 billion by 2020.
This might lead you to believe that Apache Hadoop is currently the backbone
of data infrastructures for all enterprises; however, widespread enterprise
adoption has been shockingly low.
While the platform is a key technology for gaining business insights from
organizational Big Data, its enterprise adoption has not lived up to Hadoop's
game-changing business potential. In fact, according to Gartner,
"Despite considerable hype and reported successes for early adopters, 54
percent of survey respondents report no plans to inv... (more)
Apache Hadoop Is Retro: Unlocking Business Value
Apache Hadoop is a key technology for gaining business insights from your Big
Data, but the penetration into enterprises is shockingly low. In fact, Apache
Hadoop and Big Data proponents recognize that this technology has not yet
achieved its game-changing business potential.
In his session at 19th Cloud Expo, John Mertic, director of program
management for ODPi at The Linux Foundation, will explain why this is, how we
can work together as an open data community to increase adoption, and the
importance of open source-based Big Dat... (more)
The speed at which data is generated, consumed, processed, and analyzed is
increasing at a remarkable rate. Social media, the Internet of Things, ad
tech, and gaming verticals are struggling to keep up with the sheer volume and
velocity of their data sets. These industries demand data processing and
analysis in near real-time. Traditional batch-oriented Big Data frameworks
such as Apache Hadoop are not well suited to these use cases.
As a result, multiple open source projects have been launched in the last few
years to handle streaming data. All were designed to
process a never-endin... (more)
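To make the contrast concrete, here is a minimal, hypothetical Python sketch of the streaming model these projects share: instead of running a batch job over a finished data set, the program consumes an unbounded sequence of events and keeps its results continuously up to date. The event source and the sliding-window counts are illustrative assumptions, not code from any particular framework.

```python
import time
from collections import Counter, deque
from itertools import islice

def event_stream():
    """Hypothetical stand-in for an unbounded source (message queue, socket, etc.)."""
    events = ["click", "view", "purchase", "view", "click"]
    while True:                      # the stream never ends
        for e in events:
            yield {"type": e, "ts": time.time()}
            time.sleep(0.1)

def sliding_window_counts(stream, window_seconds=5):
    """Maintain per-type counts over the last `window_seconds` of events."""
    window = deque()                 # (timestamp, type) pairs currently in the window
    counts = Counter()
    for event in stream:
        window.append((event["ts"], event["type"]))
        counts[event["type"]] += 1
        # Evict events that have fallen out of the time window.
        cutoff = event["ts"] - window_seconds
        while window and window[0][0] < cutoff:
            _, old_type = window.popleft()
            counts[old_type] -= 1
        yield dict(counts)           # an up-to-date result after every event

if __name__ == "__main__":
    # Print the first ten snapshots of an otherwise endless computation.
    for snapshot in islice(sliding_window_counts(event_stream()), 10):
        print(snapshot)
```

The point of the sketch is the shape of the computation, not the bookkeeping: results are emitted per event rather than once per job, which is exactly what batch-oriented MapReduce pipelines were not built to do.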
Hadoop in the Cloud
Traditional on-premises data centers have long been the natural home of modern
data platforms like Apache Hadoop, which meant companies that built their
business on the public cloud struggled to run Big Data processing and
analytics at scale. But recent advancements in Hadoop performance, security,
and, most importantly, cloud-native integrations are giving organizations the
ability to gain real value from all their data.
In his session at 19th Cloud Expo, David Tishgart, Director of Product
Marketing at Cloudera, will cover the ins and outs of Hadoop, and how it can ... (more)
How to Use Kibana 4 for Log Analysis
In his General Session at DevOps Summit, Asaf Yigal, Co-Founder & VP of
Product at Logz.io, will explore the value of Kibana 4 for log analysis and
will give a live, hands-on tutorial on how to set up Kibana 4 and get
the most out of Apache log files.
He will examine three use cases: IT operations, business intelligence, and
security and compliance. This is a hands-on session that will require
participants to bring their own laptops, and we will provide the rest.
Asaf Yigal is co-founder and VP of Product at log analytics soft... (more)
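As a rough illustration of the kind of preparation such a tutorial covers, the sketch below parses Apache's "combined" access-log format into structured JSON documents; once documents like these are indexed into Elasticsearch (typically via a log shipper such as Logstash), Kibana 4 can slice them by status code, client, or URL. The field names and the sample line are assumptions made for illustration, not material from the session.

```python
import json
import re

# Apache "combined" log format:
# %h %l %u %t "%r" %>s %b "%{Referer}i" "%{User-Agent}i"
COMBINED = re.compile(
    r'(?P<host>\S+) (?P<ident>\S+) (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Turn one Apache access-log line into a dict suitable for indexing."""
    match = COMBINED.match(line)
    if match is None:
        return None                      # skip malformed lines
    doc = match.groupdict()
    doc["status"] = int(doc["status"])
    doc["size"] = 0 if doc["size"] == "-" else int(doc["size"])
    return doc

if __name__ == "__main__":
    sample = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
              '"GET /apache_pb.gif HTTP/1.0" 200 2326 '
              '"http://www.example.com/start.html" "Mozilla/4.08"')
    print(json.dumps(parse_line(sample), indent=2))
```

In practice a shipper such as Logstash usually handles this parsing with its grok filters; the Python version simply shows what the structured document behind a Kibana dashboard looks like.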
(SYS-CON Media) - Whether you're a company of one or 100, managing knowledge
is a core concern, and implementing a knowledge base is a sensible way to
capture your content. Dokuwiki is a practical open source Web application for
creating a knowledge base that's easy for novice Webmasters to set up but
flexible and full-featured.
The Dokuwiki Web site (www.splitbrain.org/projects/dokuwiki) describes
Dokuwiki as "a simple to use wiki aimed at a small company's documentation
needs. It works on plain text files and thus needs no database. It has a
simple but powerful syntax which ... (more)