HDPCD: Spark using Scala
What Will I Learn?
- Learn Scala, Spark, HDFS, and related tools in preparation for the HDPCD Spark certification
Requirements
- Basic programming skills
- Hortonworks Sandbox, a valid account for the IT Versity Big Data labs, or any Hadoop cluster where Hadoop, Hive, and Spark are well integrated
- A 64-bit operating system with the minimum memory required by your chosen environment
The course covers the full syllabus of the HDPCD: Spark certification.
- Scala Fundamentals – Basic Scala programming required using REPL
- Getting Started with Spark – Different setup options, setup process
- Core Spark – Transformations and Actions to process the data
- Data Frames and Spark SQL – Leverage SQL skills on top of Data Frames created from Hive tables or RDDs
- One month of complimentary lab access
- Exercises – A set of self-evaluated exercises to test your skills for certification purposes
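As a small taste of the Scala Fundamentals module, here is a sketch of the kind of REPL session the course starts with (values, function definitions, and higher-order collection methods; the names and numbers are illustrative, not from the course material):

```scala
// Values, a function definition, and collection operations,
// as you would type them into the Scala REPL.
val orders = List(10.0, 25.5, 7.25, 42.0)

// A simple function: apply a percentage discount to an amount
def applyDiscount(amount: Double, pct: Double): Double =
  amount * (1 - pct / 100)

// Higher-order collection methods: map, filter, and sum
val discounted = orders.map(applyDiscount(_, 10)) // 10% off each order
val large = discounted.filter(_ > 20)             // keep totals above 20
val total = large.sum                             // aggregate the remainder
```

These same collection idioms carry over almost unchanged to Spark RDDs, which is why the course covers them first.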
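The Core Spark module centers on the distinction between lazy transformations and eager actions. A minimal sketch, assuming you are in `spark-shell` where a `SparkContext` is available as `sc` (the input path and record format are hypothetical):

```scala
// Transformations (map, reduceByKey) are lazy: they only build a plan.
// Actions (collect, count) trigger the actual computation on the cluster.
val lines = sc.textFile("orders.txt")      // hypothetical input file

val pairs = lines.
  map(_.split(",")).                       // e.g. records like "orderId,status"
  map(fields => (fields(1), 1))            // key each record by its status

val counts = pairs.reduceByKey(_ + _)      // transformation: combine per key

counts.collect().foreach(println)          // action: runs the job, prints (status, count)
```

Nothing is read from disk until `collect()` is called; this laziness is a recurring theme in the certification exercises.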
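For the Data Frames and Spark SQL module, the idea is that existing SQL skills transfer directly to Hive tables. A hedged sketch in the Spark 2.x style (on Spark 1.6 clusters the equivalent entry point is `sqlContext`/`HiveContext`; the table and column names are illustrative):

```scala
import org.apache.spark.sql.SparkSession

// Build a session with Hive support so SQL can run against Hive tables
val spark = SparkSession.builder().
  appName("SparkSQLDemo").
  enableHiveSupport().
  getOrCreate()

// Plain SQL over a Hive table, returning a Data Frame
val topCustomers = spark.sql(
  """SELECT customer_id, SUM(order_total) AS revenue
     FROM orders
     GROUP BY customer_id
     ORDER BY revenue DESC
     LIMIT 10""")

topCustomers.show()  // action: executes the query and prints the rows
```

The same Data Frame API also works on Data Frames built from RDDs, which the course covers alongside the Hive-table path.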
After completing the course, you will have enough confidence to attempt the certification and pass it.
All demos are given on our state-of-the-art Big Data cluster. You can avail of one month of complimentary lab access by filling in this form, which is provided as part of the welcome message.
Who is the target audience?
- Anyone who wants to prepare for the HDPCD Spark certification using Scala