A Key Role in the Initial Growth of the World Wide Web

Apache Web Server Journal



Top Stories

Operational Hadoop and the Lambda Architecture for Streaming Data Apache Hadoop is emerging as a distributed platform for handling large and fast incoming streams of data. Predictive maintenance, supply chain optimization, and Internet-of-Things analysis are examples where Hadoop provides the scalable storage, processing, and analytics platform to gain meaningful insights from granular data that is typically only valuable from a large-scale, aggregate view. One architecture useful for capturing and analyzing streaming data is the Lambda Architecture, representing a model of how to analyze real-time and historical data together. In his session at Big Data Expo, Dale Kim, Director of Industry Solutions at MapR, covered the practice of capturing canonical data "as it lands" as a baseline for accommodating future analytics requirements. Download Slide Deck: ▸ Here Spe... (more)
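The query-time merge at the heart of the Lambda Architecture described above can be sketched in a few lines. This is an illustrative toy, not MapR or Hadoop API code: the batch layer periodically recomputes an aggregate over all historical events, the speed layer keeps a running aggregate over events that arrived since the last batch run, and a query merges the two views.

```python
from collections import Counter

def batch_view(historical_events):
    """Batch layer: full recomputation over the master dataset."""
    return Counter(e["key"] for e in historical_events)

def speed_view(recent_events):
    """Speed layer: incremental counts for not-yet-batched events."""
    return Counter(e["key"] for e in recent_events)

def query(batch, speed):
    """Serving layer: merge the two views to answer a query."""
    return batch + speed

# Hypothetical event stream: keys identify the entity being counted.
historical = [{"key": "sensor-a"}, {"key": "sensor-a"}, {"key": "sensor-b"}]
recent = [{"key": "sensor-a"}, {"key": "sensor-c"}]

merged = query(batch_view(historical), speed_view(recent))
print(merged["sensor-a"])  # 3: two historical events plus one recent one
```

The design point is that the batch view is immutable between recomputations, so capturing canonical data "as it lands" lets future analytics requirements be served by simply recomputing the batch view.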

[slides] #Hadoop in the #Cloud | @CloudExpo @Cloudera #BigData #DataCenter

Download Slide Deck: ▸ Here Hadoop in the Cloud Traditional on-premises data centers have long been the domain of modern data platforms like Apache Hadoop, meaning companies that build their business on the public cloud were challenged to run Big Data processing and analytics at scale. But recent advancements in Hadoop performance, security, and, most importantly, cloud-native integrations are giving organizations the ability to truly gain value from all their data. In his session at 19th Cloud Expo, David Tishgart, Director of Product Marketing at Cloudera, covered the ins and outs of Hadoop and how it can help cloud-based businesses. Attendees will learn: how to speed ETL pipelines and data transformations; how to run low-latency ad-hoc analytics across regions against object store data; how to perform high-va... (more)

‘The Pain Curve’ By @StackIQ | @CloudExpo #BigData #DevOps #Hadoop

Download Slide Deck: ▸ Here Due to the rise of Hadoop, many enterprises are now deploying their first small clusters of 10 to 20 servers. At this small scale, the complexity of operating the cluster looks and feels like that of general data center servers. It is not until the clusters scale, as they inevitably do, that the pain caused by the exponential complexity becomes apparent. We've seen this problem occur time and time again. In his session at Big Data Expo, Greg Bruno, Vice President of Engineering and co-founder of StackIQ, described why clusters are so different from farms of single-purpose servers that reside in traditional data centers, and why, without an automated solution that can address the cluster requirements, real pain is coming and failure is certain. Speaker Bio: Greg Bruno is the Vice President of Engineering and co-founder of StackIQ. Prior to jo... (more)

Kibana 4 for Log Analysis | @DevOpsSummit @Logzio #DevOps #ML #Elasticsearch

How to Use Kibana 4 for Log Analysis In his General Session at DevOps Summit, Asaf Yigal, Co-Founder & VP of Product at Logz.io, will explore the value of Kibana 4 for log analysis and will give a live, hands-on tutorial on how to set up Kibana 4 and get the most out of Apache log files. He will examine three use cases: IT operations, business intelligence, and security and compliance. This is a hands-on session that will require participants to bring their own laptops, and we will provide the rest. Speaker Bio Asaf Yigal is co-founder and VP of Product at log analytics software company Logz.io. In the past, he was co-founder of social-trading platform Currensee, which was later acquired by OANDA. He was also an early employee of server performance-monitoring company Akorri and storage resource-management startup Onaro, both of which were acquired by NetApp (NT... (more)
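The Apache log files the session works with follow the server's "combined" log format. As a minimal sketch of what a log-analysis pipeline extracts from each line before indexing, the regex below covers the common case only and is illustrative, not a full parser:

```python
import re

# One Apache "combined" format log line broken into named fields.
COMBINED = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_line(line):
    """Return a dict of fields, or None if the line does not match."""
    m = COMBINED.match(line)
    return m.groupdict() if m else None

line = ('127.0.0.1 - - [10/Oct/2016:13:55:36 -0700] '
        '"GET /index.html HTTP/1.1" 200 2326 '
        '"-" "Mozilla/5.0"')
rec = parse_line(line)
print(rec["status"], rec["path"])  # 200 /index.html
```

In practice a shipper such as Logstash does this parsing before the records reach Elasticsearch and Kibana; the sketch just shows the shape of the structured data Kibana visualizes.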

[session] Apache #Hadoop Is Retro | @BigDataExpo @JMertic #Linux #BigData

Apache Hadoop Is Retro: Unlocking Business Value Apache Hadoop is a key technology for gaining business insights from your Big Data, but the penetration into enterprises is shockingly low. In fact, Apache Hadoop and Big Data proponents recognize that this technology has not yet achieved its game-changing business potential. In his session at 19th Cloud Expo, John Mertic, director of program management for ODPi at The Linux Foundation, will explain why this is, how we can work together as an open data community to increase adoption, and the importance of open source-based Big Data standardization to help unlock more business value for Apache Hadoop initiatives. Speaker Bio John Mertic is Director of Program Management for ODPi and Open Mainframe Project at The Linux Foundation. Previously, he was director of business development software alliances at Bitnami. He come... (more)

Dokuwiki - A Practical Open Source Knowledge Base Solution

(SYS-CON Media) - Whether you're a company of one or 100, managing knowledge is a core concern, and implementing a knowledge base is a sensible way to capture your content. Dokuwiki is a practical open source Web application for creating a knowledge base that's easy for novice Webmasters to set up but flexible and full-featured. The Dokuwiki Web site (www.splitbrain.org/projects/dokuwiki) describes Dokuwiki as "a simple to use wiki aimed at a small company's documentation needs. It works on plain text files and thus needs no database. It has a simple but powerful syntax which makes sure the data files remain readable outside the wiki." Dokuwiki runs on a variety of Web servers, including Apache and IIS, and requires PHP 4.3.x or higher. If you do not have your own Web server, you can install Dokuwiki on a hosted Web site, as long as the Web host includes PHP access. ... (more)

[session] @Cloudera to Present at @CloudExpo | #BigData #IoT #DataCenter

Hadoop in the Cloud Traditional on-premises data centers have long been the domain of modern data platforms like Apache Hadoop, meaning companies that build their business on the public cloud were challenged to run Big Data processing and analytics at scale. But recent advancements in Hadoop performance, security, and, most importantly, cloud-native integrations are giving organizations the ability to truly gain value from all their data. In his session at 19th Cloud Expo, David Tishgart, Director of Product Marketing at Cloudera, will cover the ins and outs of Hadoop and how it can help cloud-based businesses. Attendees will learn: how to speed ETL pipelines and data transformations; how to run low-latency ad-hoc analytics across regions against object store data; how to perform high-value analytical workloads that enhance customer insights, improve product and services... (more)

The Top 150 Players in Cloud Computing

Cloud Expo Early Bird Savings A robust ecosystem of solutions providers is emerging around cloud computing. Here, SYS-CON's Cloud Computing Journal expands its list of most active players in the fast-emerging Cloud Ecosystem, from the 'mere' 100 we identified back in January of this year, to half as many again - testimony, if any further were needed, to the fierce and continuing growth of the "Elastic IT" paradigm throughout the world of enterprise computing. Editorial note: The words in quotation marks used to describe the various services and solutions in this round-up are in every case taken from the Web sites cited. As ever we encourage software engineers, developers, IT operations managers, and new/growing companies in every case to "suck it and see" by downloading or otherwise sampling the offering in question for themselves. (Omissions to this Top 150 list sh... (more)

Securing and Authenticating SOAP-Based Web Services

If you are implementing a multiuser system, your system will probably have certain attributes. It may be implemented in a distributed fashion and it may have some sort of security model. In its most basic form, such a system can be represented by a straight line on a piece of paper: below the line is the information, content, data (call it what you will); and above the line are the various individuals, groups, and roles that need to work with what is below the line. We connect clients above the line to the data below it by exposing services that provide access through the line. These services describe the operations that may be performed upon the data. The security model implemented within the service layer determines who may perform those operations and upon which particular bits of data. To the extent that these services are the only way to access the data, our l... (more)
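The model described above — the service layer deciding who may perform which operations on which data — can be sketched as a role-based check. The roles, permissions, and function names here are hypothetical illustrations, not part of any WS-Security or SOAP toolkit API:

```python
# Toy role-to-operation permission table; hypothetical roles and operations.
PERMISSIONS = {
    "reader": {"read"},
    "editor": {"read", "update"},
    "admin": {"read", "update", "delete"},
}

def authorize(role, operation):
    """Return True if the caller's role permits the requested operation."""
    return operation in PERMISSIONS.get(role, set())

def service_delete(record_id, caller_role):
    """The service exposes the operation; the check gates access to the data."""
    if not authorize(caller_role, "delete"):
        raise PermissionError(f"role {caller_role!r} may not delete")
    return f"record {record_id} deleted"

print(service_delete(42, "admin"))  # record 42 deleted
```

The point of the "line" metaphor is that this check lives below the line, in the service layer: as long as the services are the only path to the data, clients cannot bypass it.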

An Introduction to Ant

Writing shell scripts to automate the build and deploy process for ColdFusion applications is not very much fun. The Jakarta Ant project is an open-source, cross-platform alternative that makes it easy to automate the build and deploy process. But My Build and Deploy Process is Fine.... Maybe your build and deploy process for your latest application is fine - you type a single command and your build process automatically retrieves your application from the source control system, configures the application appropriately for the target environment, and copies all the necessary files to the production servers while you head to the coffee shop for your morning cup of caffeine and the newspaper. But I know that, in reality, the vast majority of projects I've seen (including many of my own applications!) are built and deployed using a written multistep checklist - some ... (more)
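The single-command workflow described above is driven by an Ant build file. A minimal sketch of one is shown below; the project, target, and directory names are hypothetical, not taken from the article:

```xml
<!-- build.xml: run "ant" to execute the default "deploy" target. -->
<project name="myapp" default="deploy" basedir=".">
    <property name="src.dir" value="src"/>
    <property name="build.dir" value="build"/>
    <property name="deploy.dir" value="/var/www/myapp"/>

    <target name="clean">
        <delete dir="${build.dir}"/>
    </target>

    <!-- "depends" chains targets: deploy runs build, which runs clean. -->
    <target name="build" depends="clean">
        <mkdir dir="${build.dir}"/>
        <copy todir="${build.dir}">
            <fileset dir="${src.dir}"/>
        </copy>
    </target>

    <target name="deploy" depends="build">
        <copy todir="${deploy.dir}">
            <fileset dir="${build.dir}"/>
        </copy>
    </target>
</project>
```

Because targets declare their dependencies, typing `ant deploy` reruns the whole clean-build-copy chain in order, which is exactly the multistep checklist captured in executable form.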

The i-Technology Right Stuff

Related Links: Wanted: 19 More of the Top Software People in the World Sung and Unsung i-Technology Heroes Who's Missing from SYS-CON's i-Technology Top Twenty? Our search for the Twenty Top Software People in the World is nearing completion. In the SYS-CON tradition of empowering readers, we are leaving the final "cut" to you, so here are the top 40 nominations in alphabetical order. Our aim this time round is to whittle this 40 down to our final twenty, not (yet) to arrange those twenty in any order of preference. All you need to do to vote is to go to the Further Details page of any nominee you'd like to see end up in the top half of the poll when we close voting on Christmas Eve, December 24, and cast your vote or votes. To access the Further Details of each nominee, just click on their name. Happy voting! In alphabetical order the nominees are: Tim Berner... (more)