Oracle 1Z0-449 Certification Exam Syllabus

You can use this exam guide to collect all the information about the Oracle Big Data 2017 Implementation Essentials (1Z0-449) certification. The Oracle 1Z0-449 certification is mainly targeted at candidates who have some experience with or exposure to Oracle Big Data X4-2 and want to advance their careers with the Oracle Big Data 2017 Certification Implementation Specialist (OCS) credential. The Oracle Big Data 2017 Implementation Essentials certification exam validates your understanding of Oracle Big Data X4-2 technology and sets the stage for your future progression. Your preparation plan for the Oracle 1Z0-449 certification exam should include hands-on practice or on-the-job experience performing the tasks described in the following Certification Exam Topics table.

Oracle 1Z0-449 Certification Details:

Exam Name: Oracle Big Data 2017 Implementation Essentials
Exam Code: 1Z0-449
Exam Product Version: Oracle Big Data X4-2
Exam Price: USD $245 (pricing may vary by country or by localized currency)
Duration: 120 minutes
Number of Questions: 72
Passing Score: 67%
Format: Multiple Choice
Recommended Training: Oracle Big Data 2016 Implementation Specialist
Schedule Exam: Pearson VUE - Oracle
Recommended Practice: 1Z0-449 Online Practice Exam

Oracle 1Z0-449 Certification Topics:

Big Data Technical Overview:
- Describe the architectural components of the Big Data Appliance
- Describe how Big Data Appliance integrates with Exadata and Exalytics
- Identify and architect the services that run on each node in the Big Data Appliance, as it expands from single to multiple nodes
- Describe the Big Data Discovery and Big Data Spatial and Graph solutions
- Explain the business drivers behind Big Data and NoSQL versus Hadoop
Core Hadoop:
- Explain the Hadoop Ecosystem
- Implement the Hadoop Distributed File System
- Identify the benefits of the Hadoop Distributed File System (HDFS)
- Describe the architectural components of MapReduce
- Describe the differences between MapReduce and YARN
- Describe Hadoop High Availability
- Describe the importance of the NameNode, DataNode, JobTracker, and TaskTracker in Hadoop
- Use Flume in the Hadoop Distributed File System
- Implement the data flow mechanism used in Flume
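The map-shuffle-reduce dataflow named in the MapReduce objectives above can be illustrated with ordinary Unix pipes. This is an analogy only, not Hadoop itself: on a cluster, Hadoop distributes the same three stages across DataNodes, but the dataflow is identical.

```shell
# map:     emit one record (word) per input token
# shuffle: sort brings identical keys together
# reduce:  uniq -c counts each group of identical keys
echo "big data big hadoop data big" \
  | tr ' ' '\n' \
  | sort \
  | uniq -c \
  | sort -rn
```

This prints the word counts in descending order: `3 big`, `2 data`, `1 hadoop`.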
Oracle NoSQL Database:
- Use an Oracle NoSQL database
- Describe the architectural components (Shard, Replica, Master) of the Oracle NoSQL database
- Set up the KVStore
- Use KVLite to test NoSQL applications
- Integrate an Oracle NoSQL database with an Oracle database and Hadoop
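The KVStore objectives above can be exercised locally with KVLite, the single-process store bundled with Oracle NoSQL Database for development and testing. A minimal command sketch, assuming the product is unpacked under a placeholder path $KVHOME; flag names follow the KVLite documentation, so verify them against your release:

```shell
# Start KVLite, a single-node KVStore for testing NoSQL applications.
# $KVHOME is a placeholder for your Oracle NoSQL Database install directory.
java -jar $KVHOME/lib/kvstore.jar kvlite -root /var/kvroot -host localhost

# In a second terminal, ping the store to confirm it is up
# (5000 is KVLite's default registry port):
java -jar $KVHOME/lib/kvstore.jar ping -host localhost -port 5000
```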
Cloudera Enterprise Hadoop Distribution:
- Describe the Hive architecture
- Set up Hive with formatters and SerDe
- Implement Oracle Table Access for Hadoop (OTA4H)
- Describe the Impala real-time query engine and explain how it differs from Hive
- Create a database and table from a Hadoop Distributed File System file in Hive
- Use Pig Latin to query data in HDFS
- Execute a Hive query
- Move data from a database to a Hadoop Distributed File System using Sqoop
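Several of the objectives above (creating a database and table over an HDFS file, then executing a Hive query) come together in a short HiveQL session. A minimal sketch: the database, table, columns, and HDFS path are all hypothetical examples.

```sql
-- Create a database, then an external table over a comma-delimited
-- file already loaded into HDFS (path and columns are examples).
CREATE DATABASE IF NOT EXISTS sales;

CREATE EXTERNAL TABLE sales.web_orders (
  order_id  INT,
  customer  STRING,
  amount    DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/demo/web_orders';

-- Execute a Hive query against it (Impala can query the same
-- metastore table, typically with lower latency).
SELECT customer, SUM(amount) AS total
FROM sales.web_orders
GROUP BY customer;
```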
Programming with R:
- Describe the Oracle R Advanced Analytics for Hadoop (ORAAH) connector
- Use the Oracle R Advanced Analytics for Hadoop connector
- Describe the architectural components of Oracle R Advanced Analytics for Hadoop
- Implement an Oracle Database connection with Oracle R Enterprise
Oracle Loader for Hadoop:
- Explain the Oracle Loader for Hadoop
- Configure the online and offline options for the Oracle Loader for Hadoop
- Load Hadoop Distributed File System Data into an Oracle database
Oracle SQL Connector for Hadoop Distributed File System (HDFS):
- Configure an external table for HDFS using the Oracle SQL Connector for Hadoop
- Install the Oracle SQL Connector for Hadoop
- Describe the Oracle SQL Connector for HDFS
Oracle Data Integrator (ODI) and Hadoop:
- Use ODI to transform data from Hive to Hive
- Use ODI to move data from Hive to Oracle
- Use ODI to move data from an Oracle database to a Hadoop Distributed File System using Sqoop
- Configure the Oracle Data Integrator with Application Adaptor for Hadoop to interact with Hadoop
Big Data SQL:
- Explain how Big Data SQL is used in a Big Data Appliance/Exadata architecture
- Set up and configure Oracle Big Data SQL
- Demonstrate Big Data SQL syntax used in create table statements
- Access NoSQL and Hadoop data using a Big Data SQL query
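The create-table syntax referenced above uses Oracle Database's external-table framework with a Big Data SQL access driver. A sketch assuming a Hive table named sales.web_orders already exists on the cluster; the access-parameter spelling follows the Big Data SQL documentation, so verify it against your release.

```sql
-- Oracle Database external table served by Big Data SQL's ORACLE_HIVE
-- access driver; the Hive source table name below is an example.
CREATE TABLE web_orders_ext (
  order_id  NUMBER,
  customer  VARCHAR2(100),
  amount    NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_HIVE
  DEFAULT DIRECTORY DEFAULT_DIR
  ACCESS PARAMETERS (com.oracle.bigdata.tablename=sales.web_orders)
)
REJECT LIMIT UNLIMITED;

-- Query Hadoop-resident data with ordinary Oracle SQL:
SELECT customer, SUM(amount) FROM web_orders_ext GROUP BY customer;
```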
XQuery for Hadoop Connector:
- Set up the Oracle XQuery for Hadoop connector
- Perform a simple XQuery using Oracle XQuery for Hadoop
- Use Oracle XQuery for Hadoop with Hive to map an XML file into a Hive table
Securing Hadoop:
- Describe Oracle Big Data Appliance security and encryption features
- Set up Kerberos security in Hadoop
- Set up the Hadoop Distributed File System to use Access Control Lists
- Set up Hive and Impala access security using Apache Sentry
- Use LDAP and Active Directory for Hadoop access control
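The HDFS Access Control List objective above maps to the standard `hdfs dfs -setfacl` and `-getfacl` commands. A command sketch to run on a cluster node; the user name and path below are examples:

```shell
# ACLs must first be enabled on the NameNode:
# set dfs.namenode.acls.enabled=true in hdfs-site.xml.

# Grant read/execute on /data to the (example) user 'alice':
hdfs dfs -setfacl -m user:alice:r-x /data

# Verify the resulting ACL entries:
hdfs dfs -getfacl /data
```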

The Oracle 1Z0-449 Certification Program certifies candidates on skills and knowledge related to Oracle Big Data X4-2 products and technologies. The Oracle Big Data 2017 Certification Implementation Specialist credential is granted based on a combination of passing exams, training, and performance-based assignments, depending on the level of certification. Oracle 1Z0-449 certification is a tangible benchmark of experience and expertise that helps you stand out to employers. To ensure success, Oracle recommends combining education courses, practice exams, and hands-on experience to prepare for your Oracle Big Data 2017 Implementation Essentials certification exam, as the questions will test your ability to apply the knowledge you have gained through hands-on practice or professional experience.
