Hitachi HCE-5920 Exam Questions

Questions for the HCE-5920 were updated on: Nov 21, 2025

Page 1 out of 4. Viewing questions 1-15 out of 60

Question 1

You want to enable PDI step performance monitoring. Since memory resources are limited, you decide to set a limit on the number of snapshots.
Which option should be used to accomplish this task?

  • A. The 'Logging interval (seconds)' option in the Performance section
  • B. The 'Limit' option in the Snapshot tab
  • C. The 'KETTLE_MAX_JOB_TRACKER_SIZE' option in kettle.properties
  • D. The 'Maximum number of snapshots in memory' option in the Monitoring tab
Answer:

D
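For context, the two configuration surfaces mentioned in options C and D can be sketched as follows; the variable name is real, but the value shown is an illustrative assumption:

```properties
# kettle.properties fragment (sketch, in the user's .kettle directory).
# Caps the number of job tracker entries kept in memory (option C) --
# this governs job tracking, not the step-snapshot limit.
KETTLE_MAX_JOB_TRACKER_SIZE=5000
```

The snapshot cap itself is set per transformation in Spoon, under Transformation properties > Monitoring, via the 'Maximum number of snapshots in memory' field (option D).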


Question 2

You have slow-running steps in a PDI transformation and you notice that it is taking a long time for
subsequent steps to get data and begin processing.
Which action will help solve the problem?

  • A. Reduce the value in the 'Nr of rows in rowset' option on the Miscellaneous tab in the Transformation properties.
  • B. Select the 'Enable step performance monitoring?' option on the Monitoring tab in the Transformation properties.
  • C. Right-click the slow-running steps and select the Load Balance option from the Data Movement submenu.
  • D. Select the 'Execute for every input row?' option on the Advanced tab of the Transformation properties from the parent job.
Answer:

A
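The rowset size from option A is stored in the transformation's .ktr XML. A minimal sketch, assuming the standard .ktr layout (the value 1000 is illustrative):

```xml
<!-- .ktr fragment: <size_rowset> mirrors the 'Nr of rows in rowset'
     option on the Miscellaneous tab of the Transformation properties. -->
<transformation>
  <info>
    <size_rowset>1000</size_rowset>
  </info>
</transformation>
```

A smaller rowset fills sooner, so upstream steps hand rows to subsequent steps earlier instead of buffering large batches.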


Question 3

After monitoring a performance bottleneck in a transformation, you identified a User Defined Java Expression step as the problem.
Which two actions could help resolve the bottleneck? (Choose two.)

  • A. Replace the step with a Modified Java Script Value step.
  • B. Increase the 'Change Number of Copies to Start' setting for the step.
  • C. Replace the step with a User Defined Java Class step.
  • D. Split the step into multiple consecutive steps.
Answer:

C, D


Question 4

A customer's transformation is running slowly in a test environment. You have access to Spoon and you can run and monitor the job.
How do you troubleshoot this problem?

  • A. Execute the transformation via the pan script and pass the performance-gathering parameter.
  • B. Ensure there is enough memory on the Pentaho server and that there are no 'Out Of Memory' errors in the log.
  • C. Make sure the customer is using data partitioning to ensure parallel processing for faster execution.
  • D. Verify that there are no bottleneck steps in the transformation by comparing the number of rows in the input buffer versus the output buffer within the Step Metrics tab.
Answer:

A


Question 5

A transformation is running in a production environment and you want to monitor it in real time.
Which tool should you use?

  • A. Pentaho Operations Mart
  • B. Kettle status page
  • C. Log4j
  • D. Monitoring tab
Answer:

D


Question 6

The log files on a Pentaho server rotate daily and are getting too large. You want to change the logs to rotate hourly.
How do you adjust the log settings?

  • A. Modify the transformation's properties.
  • B. Modify the Pentaho server's log4j.xml.
  • C. Modify the Tomcat startup.bat or startup.sh script.
  • D. Modify the start-pentaho.bat or start-pentaho.sh script.
Answer:

A
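For reference, on a Pentaho server that uses log4j 1.x, hourly rotation is a DatePattern change in log4j.xml. A sketch under that assumption (the appender name and file path are illustrative):

```xml
<!-- log4j.xml sketch: appending -HH to the DatePattern makes the
     DailyRollingFileAppender roll every hour instead of every day. -->
<appender name="PENTAHOFILE" class="org.apache.log4j.DailyRollingFileAppender">
  <param name="File" value="../logs/pentaho.log"/>
  <param name="DatePattern" value="'.'yyyy-MM-dd-HH"/>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d %-5p [%c] %m%n"/>
  </layout>
</appender>
```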


Question 7

A customer has an archive-based installation. They have not configured logging tables or changed the
default configuration settings. They need to research an issue that has been affecting one of their
scheduled PDI jobs for the past week.
In this situation, where do they go to view more details about the execution of these jobs?

  • A. pentaho-server/tomcat/logs
  • B. pentaho-service/data
  • C. pentaho-server/pentaho
  • D. solutions/system/karaf/system/org/apache/karaf/log pentaho/server/pentaho-solution/system/karaf/System/commons logging
Answer:

B


Question 8

You need to design a PDI job that will execute a transformation and then send an e-mail with an
attached log of the transformation’s execution.
Which two sets of actions will accomplish this task? (Choose two.)

  • A. In the Mail entry's options, select the 'Attach files to message' option and select the file type 'Log'.
  • B. In the Transformation entry's options, select the 'Specify logfile' option and enter a name and extension for the file.
  • C. In the Log tab of the Job properties, configure the Log Connection and the Log table options in the Job entry log table section.
  • D. In the Mail entry's options, select the 'Attach files to message' option and select the file type 'General'.
Answer:

B, D


Question 9

What are Job checkpoints?

  • A. Error handling hops that manage e-mail notifications.
  • B. Points in a job that have finished successfully and are automatically skipped after a previously failed run.
  • C. Points in a job that log the step metrics during certain intervals.
  • D. Points in a job that send a heartbeat request to the Pentaho server to ensure connectivity.
Answer:

C


Question 10

You are connecting to a secured Hadoop cluster from Pentaho and want to use impersonation.
Which Pentaho tool should you use?

  • A. Pentaho Report Designer
  • B. Pentaho Spoon
  • C. Pentaho Security Manager
  • D. Pentaho Server
Answer:

A


Question 11

You are planning to connect to a secured Hadoop cluster from Pentaho.
Which two authentication methods are supported? (Choose two.)

  • A. Trusted
  • B. Keytab
  • C. Password
  • D. X.509 Certificate
Answer:

A, B
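For reference, keytab-based Kerberos authentication against a secured cluster is typically configured in the active Hadoop shim's config.properties. The property names below follow Pentaho's documented pattern, but treat the exact keys, principal, and path as assumptions for your shim version:

```properties
# config.properties sketch for the active Hadoop shim.
# Principal and keytab used by PDI when authenticating to the
# Kerberos-secured cluster (values are illustrative).
pentaho.authentication.default.kerberos.principal=pdi-user@EXAMPLE.COM
pentaho.authentication.default.kerberos.keytabLocation=/opt/pentaho/keytabs/pdi-user.keytab
```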


Question 12

A new customer has pre-existing Java MapReduce jobs.
How does the customer execute these jobs within PDI?

  • A. using the Pentaho MapReduce entry
  • B. using the Hadoop Job Executor entry
  • C. using Pig Script Executor entry
  • D. using Sqoop Import entry
Answer:

A


Question 13

Which three file formats are splittable on HDFS? (Choose three.)

  • A. txt
  • B. xml
  • C. Parquet
  • D. xlsx
  • E. Avro
Answer:

A, C, D


Question 14

You are running a Pentaho MapReduce (PMR) job that is failing on Hadoop. You review the YARN logs and determine that the mappers are generating out-of-memory errors.
Which action will resolve the mapper errors?

  • A. Edit the Pentaho server startup script to increase the memory setting.
  • B. Increase the Number of mapper tasks in the PMR step.
  • C. Set the JVM memory parameters appropriately in the User Defined tab of the PMR step.
  • D. Set the Enable blocking option in the PMR step.
Answer:

A
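Mapper heap is governed by standard Hadoop properties, which can be supplied as key/value pairs on the PMR entry's User Defined tab (option C). A sketch with illustrative sizes (the heap is usually set to roughly 80% of the container size):

```properties
# Illustrative Hadoop settings for out-of-memory mappers:
# YARN container size, and the JVM heap inside that container.
mapreduce.map.memory.mb=4096
mapreduce.map.java.opts=-Xmx3276m
```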


Question 15

A Big Data customer is experiencing failures on a Table input step when running a PDI transformation on AEL Spark against a large Oracle database.
What are two methods to resolve this issue? (Choose two.)

  • A. Increase the maximum size of the message buffers for your AEL environment.
  • B. Load the data to HDFS before running the transform.
  • C. Add the Step ID to the Configuration File.
  • D. Increase the Spark driver memory configuration.
Answer:

A, B
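If driver memory is the limiting factor (option D), it is normally raised in the spark-defaults.conf of the Spark installation that AEL uses; the property is standard Spark, but the value here is an illustrative assumption:

```properties
# spark-defaults.conf sketch: more heap for the Spark driver
# that runs the AEL transformation.
spark.driver.memory=8g
```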
