IBM C1000-150 Exam Questions

Questions for the C1000-150 were updated on: Dec 01, 2025

Page 1 out of 4. Viewing questions 1-15 out of 60

Question 1

Which type of log collector uses input and output plug-ins to collect data from multiple sources and
to distribute or send data to multiple destinations?

  • A. Journald
  • B. Rsyslog Sidecar
  • C. Fluentd
  • D. Audit Container
Answer: C

Explanation:
Fluentd is a log collector that uses input and output plug-ins to collect data from multiple sources
and to distribute or send data to multiple destinations. This allows Fluentd to collect and process
data from various sources and send it to various destinations with minimal effort.
Reference:
[1] https://docs.fluentd.org/
[2] https://www.fluentd.org/
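The plug-in model can be sketched with a minimal Fluentd configuration; the port, tag, and file path below are illustrative, not taken from any particular deployment:

```
# in_http input plug-in: accept events over HTTP on port 9880
<source>
  @type http
  port 9880
</source>

# copy output plug-in fans events tagged app.** out to two destinations
<match app.**>
  @type copy
  <store>
    @type file          # out_file plug-in: buffer and write to disk
    path /var/log/fluent/app
  </store>
  <store>
    @type stdout        # out_stdout plug-in: echo to the collector's log
  </store>
</match>
```

With a configuration like this, an event posted to the HTTP source with a matching tag is delivered to both stores, which is exactly the input/output plug-in behavior the question describes.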


Question 2

When dealing with OpenShift Container Platform (OCP) logs and log persistence, which component
collects all node and container logs and stores them in dedicated project indexes?

  • A. Grafana
  • B. Fluentd
  • C. Elasticsearch
  • D. Logstash
Answer: B

Explanation:
When dealing with OpenShift Container Platform (OCP) logs and log persistence, Fluentd is the
component that collects all node and container logs and stores them in dedicated project indexes.
Fluentd is an open source data collector that can collect, process, and forward data from a variety of
sources.
Reference:
[1] https://docs.openshift.com/container-platform/4.5/logging/understanding-logging.html
[2] https://docs.fluentd.org/


Question 3

Which statement is true for the Cloud Pak for Business Automation standard capabilities logging?

  • A. Logging is enabled to collect and forward standard output when configured.
  • B. Logging is enabled by default and logs are stored in a dedicated persistent data store.
  • C. Logging is not stored in a dedicated persistent data store unless specified.
  • D. Logging is viewable only by the OpenShift Container Platform (OCP) web console.
Answer: A

Explanation:
For the Cloud Pak for Business Automation standard capabilities, logging is enabled to collect and
forward standard output to the specified logging destination when it is configured. Logs are not
stored in a dedicated persistent data store unless that is explicitly specified, and viewing them is not
limited to the OpenShift Container Platform (OCP) web console.
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/bas/logging.html
[2] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/bas/logging_setup.html


Question 4

To manually scale up the Process Mining deployment in the IBM Cloud Pak for Business Automation,
which parameter section needs to be updated in the custom resource YAML file?

  • A. install
  • B. license
  • C. namespace
  • D. replicas
Answer: D

Explanation:
To manually scale up the Process Mining deployment in the IBM Cloud Pak for Business Automation,
the replicas parameter section needs to be updated in the custom resource YAML file. This parameter
allows you to specify the desired number of replicas for the deployment.
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/bas/bas_install.html#manually_scale_up_the_process_mining_deployment
[2] https://kubernetes.io/docs/tasks/run-application/scale-stateful-set/
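As a sketch, scaling means editing the custom resource YAML and raising the replicas count for the Process Mining component. The apiVersion, kind, and surrounding field names below are illustrative assumptions, not taken from the product documentation; only the replicas parameter itself is the point:

```yaml
# Hypothetical custom-resource excerpt -- field names other than "replicas"
# are assumptions and vary by product release.
apiVersion: processmining.ibm.com/v1beta1   # assumed group/version
kind: ProcessMining                          # assumed kind
metadata:
  name: processmining-instance
spec:
  processmining:
    replicas: 3   # desired number of Process Mining pods
```

Applying the updated custom resource lets the operator reconcile the deployment to the requested replica count.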


Question 5

What should be supplied as part of the custom resource prior to deployment if it is desired to use a
root CA signer certificate that is signed by a recognized certificate authority?

  • A. root_ca_certificate
  • B. root_ca_key
  • C. root_ca_store
  • D. root_ca_secret
Answer: A

Explanation:
If it is desired to use a root CA signer certificate that is signed by a recognized certificate authority,
the root_ca_certificate should be supplied as part of the custom resource prior to deployment. This is
necessary for the root CA signer certificate to be validated.
Reference:
[1] https://kubernetes.io/docs/tasks/configure-pod-container/configure-pod-certificates/#running-an-https-server
[2] https://kubernetes.io/docs/concepts/cluster-administration/certificates/
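As a hedged illustration of packaging such a certificate for the cluster before deployment (the secret name, file names, and namespace are assumptions; `oc create secret tls` is the standard OpenShift CLI form):

```shell
# Package an externally signed root CA certificate and its key as a TLS secret;
# "root-ca", rootCA.crt, rootCA.key, and the namespace are illustrative names.
oc create secret tls root-ca \
  --cert=rootCA.crt \
  --key=rootCA.key \
  -n cp4ba
```

The custom resource can then reference this material so that components present certificates chained to the recognized authority.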


Question 6

When deploying License Service Reporter, the summary card additionally shows a View license usage
link. The link leads to the License Service Reporter user interface that presents the license usage of
your products within the reporting period for a multi-cluster environment.
What is that license usage?

  • A. Peak Weekly Usage
  • B. Highest License Usage
  • C. Median Average Usage
  • D. Average Daily Usage
Answer: D

Explanation:
The license usage presented in the License Service Reporter user interface is the Average Daily
Usage: the average number of licenses used per day within the reporting period, aggregated across
the clusters in a multi-cluster environment.
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/com.ibm.cic.agent.lmgr.user/license_usage.html
[2] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/com.ibm.cic.agent.lmgr.user/view_license_usage.html


Question 7

When setting up a demo environment an identity provider may not be known. What can be used to
replace the default admin user with a simple identity provider?

  • A. htpasswd
  • B. htdigest
  • C. passwd
  • D. openssl
Answer: A

Explanation:
When setting up a demo environment, an identity provider may not be known. In this case, htpasswd
can be used to replace the default admin user with a simple identity provider. htpasswd is an Apache
utility for creating and updating the flat files that store user names and hashed passwords for basic
authentication.
Reference:
[1] https://httpd.apache.org/docs/2.4/programs/htpasswd.html
[2] https://httpd.apache.org/docs/2.4/howto/auth.html#gettingstarted
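A hedged sketch of the flow: create the file with htpasswd (a standard Apache utility), then load it into OpenShift as the secret backing an HTPasswd identity provider. The file name, secret name, user, and password are illustrative; `openshift-config` is the namespace OpenShift uses for identity-provider secrets.

```shell
# -c create the file, -B use bcrypt hashing, -b take the password from the CLI
htpasswd -c -B -b users.htpasswd admin 'S3cretPass!'

# Store the file as a secret that an HTPasswd identity provider can reference
oc create secret generic htpass-secret \
  --from-file=htpasswd=users.htpasswd \
  -n openshift-config
```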


Question 8

How is the Business Automation Studio web interface accessed?

  • A. Via web browser at URL https://<host>:<port>/BAStudio/
  • B. Via IBM Cloud Pak process administration console
  • C. Via Workflow Center Web Console
  • D. Via IBM Cloud Pak platform UI
Answer: A

Explanation:
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/bas/getting_started/overview.html
[2] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/pam/getting_started/overview.html


Question 9

What does IBM Cloud Pak foundational services monitoring require?

  • A. Role-based access control (RBAC) to monitor APIs and data
  • B. Red Hat OpenShift Container Platform monitoring to be installed
  • C. Kibana as the datasource
  • D. Adopter customization to query and visualize application metrics
Answer: A

Explanation:
IBM Cloud Pak foundational services monitoring requires Role-based access control (RBAC) to
monitor APIs and data. This ensures that only authorized users have access to the data and APIs that
are being monitored. It also ensures that data is only being accessed by users with the appropriate
permissions. Kibana is used as the data source for the Cloud Pak foundational services monitoring.
Adopter customization is only necessary to query and visualize application metrics. Red Hat
OpenShift Container Platform monitoring is not required for Cloud Pak foundational services
monitoring.
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/monitoring/overview.html
[2] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/monitoring/rbac.html


Question 10

What is the best data to check for installation and upgrade problems?

  • A. Machine status
  • B. Job status
  • C. Pod status
  • D. Node status
Answer: C

Explanation:
The best data to check for installation and upgrade problems is Pod status. Pods are the smallest
deployable units in a Kubernetes cluster and contain the necessary components to run an
application. Examining the Pod status can help identify any issues that may be present with the
installation or upgrade process. The other options are not related to this process.
Reference:
[1] https://kubernetes.io/docs/concepts/workloads/pods/
[2] https://kubernetes.io/docs/tasks/debug-application-cluster/debug-cluster-upgrade/
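As an illustrative sketch (the commands assume the oc CLI and a reachable cluster; the namespace name is hypothetical), pod status is typically checked like this:

```shell
# List pods and their phase (Pending, Running, CrashLoopBackOff, ...) in the namespace
oc get pods -n cp4ba

# Show events and container states for a pod that is not healthy
oc describe pod <pod-name> -n cp4ba

# Read the logs of the previous (crashed) container instance
oc logs <pod-name> -n cp4ba --previous
```

A pod stuck in Pending or CrashLoopBackOff during an install or upgrade is usually the first concrete symptom to investigate.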


Question 11

Operator log files can be retrieved from where?

  • A. Ansible pod
  • B. home directory
  • C. Ansible directory
  • D. YAML file directory
Answer: C

Explanation:
Operator log files can be retrieved from the Ansible directory. The Ansible directory is located in the
home directory at ~/.ansible/logs.
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/cpd/troubleshoot/operator_logs.html
[2] https://docs.ansible.com/ansible/latest/reference_appendices/config.html#ansible-log-dir


Question 12

Which statement is true about a Cloud Pak for Business Automation starter deployment?

  • A. It cannot include the Automation Document Processing capability.
  • B. It takes fewer steps than a production deployment.
  • C. It can be upgraded to a production deployment if required.
  • D. It does not use the Operator Lifecycle Manager.
Answer: C

Explanation:
A Cloud Pak for Business Automation starter deployment can be upgraded to a production
deployment if required. It is designed to provide a quick and easy way to get started with the
capabilities offered by the Cloud Pak for Business Automation. It is possible to include the
Automation Document Processing capability in a starter deployment. The starter deployment uses
the Operator Lifecycle Manager to deploy and manage the components of the Cloud Pak for Business
Automation.
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/cpd/getting_started/overview.html
[2] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/cpd/administer/overview.html


Question 13

Once a starter deployment of the Cloud Pak for Business Automation is installed, where can access to
the different capability services and applications be found?

  • A. By opening a terminal to the ibm-cp4a-operator pod and open the /opt/ibm/cp4ba-access.txt file.
  • B. By opening a config map which contains the route URL to access the components and a secret which contains the credentials to use with the different URLs.
  • C. By opening a config map which contains the route URL to access the components as well as the username and password to use with the URL in clear text.
  • D. By opening the cpd-access route, which leads to a page that lists the components URLs, usernames and passwords to use.
Answer: B

Explanation:
Once a starter deployment of the Cloud Pak for Business Automation is installed, access to the
different capability services and applications can be found by opening a config map which contains
the route URL to access the components and a secret which contains the credentials to use with the
different URLs.
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/cpd/getting_started/accessing_components.html
[2] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/cpd/getting_started/overview.html
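A hedged illustration of reading those objects with the oc CLI; the config map and secret names below are assumptions based on a deployment named "icp4adeploy", as is the namespace:

```shell
# Print the route URLs collected by the operator (object name is illustrative)
oc get configmap icp4adeploy-cp4ba-access-info -n cp4ba -o yaml

# Decode the matching credentials secret to stdout (object name is illustrative)
oc extract secret/icp4adeploy-cp4ba-access-secret -n cp4ba --to=-
```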


Question 14

Which parameter is required to forward audit logging?

  • A. AUDIT_CONTENT_BY_PROVIDERS
  • B. ENABLE_AUDIT_LOGGING_FORWARDING
  • C. AUDIT_ENABLED
  • D. SAS_API_SERVER_AUDIT_ENABLED
Answer: B

Explanation:
To forward audit logging, the ENABLEAUDITLOGGINGFORWARDING parameter is required. This
parameter is used to enable the forwarding of audit logs to an external service.
Reference:
[1] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/cpd/administer/audit.html
[2] https://www.ibm.com/support/knowledgecenter/SSFTN5_2.2.2/cpd/administer/overview.html
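As a sketch of where such a switch typically lives (only the parameter name comes from the question; the surrounding structure is an assumption):

```yaml
# Illustrative container environment entry; the exact location of this
# parameter in the deployment or custom resource varies by release.
env:
  - name: ENABLE_AUDIT_LOGGING_FORWARDING
    value: "true"
```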


Question 15

Which component can have its certificate refreshed after install?

  • A. etcd
  • B. default token
  • C. IPSec
  • D. Helm
Answer: A

Explanation:
After install, the certificate of the etcd component can be refreshed. etcd is a key-value store that
stores the Kubernetes cluster state and is used to secure communication between Kubernetes
components.
Reference:
[1] https://kubernetes.io/docs/tasks/administer-cluster/configure-upgrade-etcd/
[2] https://kubernetes.io/docs/reference/generated/kubectl/kubectl-commands#certificate-renew
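Deciding whether a certificate needs refreshing comes down to checking its validity window, which uses standard openssl commands. The sketch below generates a throwaway self-signed certificate and then prints its expiry date; the same `openssl x509` invocation can be pointed at a real etcd serving certificate (whose path depends on the distribution):

```shell
# Create a disposable self-signed certificate valid for 30 days
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout /tmp/etcd-demo.key -out /tmp/etcd-demo.crt \
  -days 30 -subj "/CN=etcd-demo"

# Print the certificate's expiry date (run this against the real etcd
# certificate to decide whether a refresh is due)
openssl x509 -in /tmp/etcd-demo.crt -noout -enddate
```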
