Apache Spark Tutorial
Apache Spark is a lightning-fast cluster computing technology, designed for fast computation. It is built on top of Hadoop MapReduce and extends the MapReduce model to efficiently support more types of computations, including interactive queries and stream processing. This brief tutorial explains the basics of Spark Core programming.
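To give a flavor of the MapReduce-style model that Spark's API generalizes, here is a minimal word-count sketch in plain Scala. It uses an ordinary Scala collection rather than a real Spark cluster (actual Spark code would obtain an RDD via a `SparkContext`, e.g. `sc.textFile(...)`, and use the same `flatMap`/reduce pattern); the input lines are illustrative only.

```scala
// Word count over a plain Scala collection, mirroring the
// map -> shuffle -> reduce phases that Spark's RDD API generalizes.
val lines = Seq("spark is fast", "spark extends mapreduce")

val counts = lines
  .flatMap(_.split(" "))                 // map phase: emit each word
  .groupBy(identity)                     // shuffle: group identical words
  .map { case (w, ws) => (w, ws.size) }  // reduce phase: count per word

// counts("spark") == 2
```

In real Spark code the transformations look the same, but they are evaluated lazily and executed in parallel across the cluster's worker nodes.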
Audience
This tutorial has been prepared for professionals aspiring to learn the basics of Big Data analytics using the Spark framework and become Spark developers. It will also be useful for analytics professionals and ETL developers.
Prerequisites
Before proceeding with this tutorial, we assume that you have prior exposure to Scala programming, database concepts, and one of the Linux operating system flavors.