Classroom – $849 + Cluster Price (Call us)
Online – coming soon
COURSE CODE: (DJ-BUILD-STARTUP-BASE)
Weekend-Only Classes – 2 hours on Saturday and 3 hours on Sunday. Your class will start on the weekend following the week you register for the course. For details, please call or email us using the contact information on the ‘Contact Us’ page.
Venue: 2750 Peachtree Industrial Blvd, Suite F, Duluth, GA 30097
About the Course
This is a unique training that no other company offers in the big data space. You learn how to build a Hadoop or NoSQL cluster literally from bare-metal servers. You take home the toys you build with us after training, so you can continue to learn big data faster than others or start your own big data startup.
Not only do we train you to build the cluster in this course, we also teach you all of the fundamental components needed to be a Hadoop Developer and Administrator.
- Not a typical training – it’s more like a boot camp.
- A 3-day POC on a production-like cluster vs. a 3-month POC on a VM at your workplace.
- 45 hours of trainer’s time with you in person.
- MAX of 5 students per class for more interaction and discussions.
- Face-to-Face training
- End-to-end understanding of big data – starting from bare-bones machine configuration, installing the OS, configuring the network, preparing the cluster for big data Hadoop software installation, installing Hadoop (pick your own distro), the Hadoop development course, and a sample case study.
- 8 quizzes throughout the class.
- 8-hour final exam – we wipe a machine back to bare metal and you repeat all the steps you learned.
- Submission of a real-world case study using Hadoop.
- Introduction to big data.
- Getting to know the importance of storage/processing power.
- Take a peek at the inside of a server.
- Understand what a cluster is and why it is needed.
- Explore your own hardware components – RAM/CPU/DRIVES/NIC cards.
- Powering up the cluster.
- Unix/Linux overview.
- Java overview.
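Before the OS-install labs, it helps to confirm the basics from the Unix/Linux and Java overviews on whatever machine you are using. A minimal sanity-check sketch (not part of the official course materials):

```shell
#!/bin/sh
# Quick environment check: kernel release, distro name, and Java availability.

uname -r                                  # print the running kernel release

if [ -f /etc/os-release ]; then
    . /etc/os-release                     # sets NAME and VERSION_ID, among others
    echo "Distro: $NAME $VERSION_ID"
fi

if command -v java >/dev/null 2>&1; then
    java -version 2>&1 | head -n 1        # Java prints its version on stderr
else
    echo "Java not found - install a JDK before the Hadoop labs"
fi
```

Running this on each machine before class saves time when the real installs begin.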
- Installing the Linux operating system on the cluster.
- Pick a distro – CentOS/Fedora/RHEL.
- Understand how to partition the drives.
- Understand the difference between a minimal install and a desktop install.
- Pick the required software before installing.
- Start installing the OS.
- Pick a password for the root user and assign hostnames to the servers.
- Finish installing the OS.
- Configure network for automatic Internet connection.
- Static IP configuration.
- Test the Internet connection.
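On CentOS/RHEL, the static-IP step above usually comes down to editing the interface’s ifcfg file. A sketch with made-up addresses – your interface name, IPs, and DNS server will differ:

```shell
# /etc/sysconfig/network-scripts/ifcfg-eth0 -- example values only.
# ifcfg files are plain KEY=VALUE pairs read by the network service.
DEVICE=eth0
# static instead of dhcp, so the address survives reboots unchanged
BOOTPROTO=static
# bring the interface up at boot
ONBOOT=yes
# this node's fixed address on the cluster network
IPADDR=192.168.1.10
NETMASK=255.255.255.0
GATEWAY=192.168.1.1
DNS1=8.8.8.8
```

After editing, restart networking (`systemctl restart network` on CentOS 7) and test with `ping -c 3 google.com`.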
- Pick a Hadoop Distribution for installation.
- Go over the prerequisites for your chosen distro.
- Prepare the cluster according to those prerequisites.
- Once the prerequisites are complete, run the prerequisite script to verify.
- Install the distribution’s ODP part.
- Use Ambari to further install the distro.
- Install the provider’s add-ons followed by the ODP components.
- Verify Installation using Ambari.
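For reference, the Ambari-driven portion of the install above typically reduces to a few commands on the management node. This is a sketch only, assuming CentOS/RHEL; the repo URL is a placeholder, and the exact steps depend on the distribution you picked:

```shell
# Run as root on the node chosen for Ambari Server (CentOS/RHEL sketch).
# The repo URL below is a placeholder - use the file for your chosen distro.
wget -O /etc/yum.repos.d/ambari.repo https://example.com/your-distro/ambari.repo

yum install -y ambari-server   # install Ambari Server and its dependencies
ambari-server setup --silent   # accept defaults: JDK and embedded database
ambari-server start            # web UI listens on port 8080 by default
```

From the web UI, the cluster install wizard registers the remaining hosts and installs the components, which the verification step then confirms.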
- Overview of contents from Big data and Hadoop Developer/Administrator Courses.
- Use case design
- Hands-on practice with Week 2 contents, with help from our trainers.
- Hands-on practice with Week 3 contents, with help from our trainers.
- Break the cluster down and rebuild it again on your own from scratch.
- Remote in and finish your use case.
- Submit your use case.
- Demo of use cases
- Practice more via remote connection on your own cluster.
- Ship the cluster to your location.
- Configure your own Static IPs for your cluster.
- Verify that the network is working correctly.
- Verify that remote support tools are installed.
- Get your cluster up and running.
- Start the BIG DATA HADOOP components.
- Sample test the components and test Ambari.
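Once the cluster is back on your network, the bring-up steps above can be scripted as a quick smoke test. A sketch with placeholder hostnames – adjust the node list for your cluster, and follow up with service checks in Ambari:

```shell
#!/bin/sh
# Post-shipment smoke test; node names below are placeholders.
# check_node reports whether a host answers a single ping.
check_node() {
    if ping -c 1 -W 2 "$1" >/dev/null 2>&1; then
        echo "$1: network OK"
    else
        echo "$1: UNREACHABLE"
    fi
}

# Replace with your cluster's actual hostnames.
for node in master1 worker1 worker2 worker3; do
    check_node "$node"
done
```

Any UNREACHABLE node points back at the static-IP or cabling steps before you start the Hadoop components.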