Questions for the NS0-604 were updated on: Dec 01, 2025
A customer wants to automatically scan, analyze, and categorize their data and then take any
required actions. Which NetApp tool should the customer use?
B
Explanation:
To automatically scan, analyze, and categorize data and take necessary actions, the customer should
use BlueXP Classification. This tool helps organizations gain insights into their data by scanning and
classifying it, allowing them to manage sensitive data, meet compliance requirements, and take
appropriate actions based on the analysis.
BlueXP Observability (A) focuses on monitoring and performance tracking, BlueXP Digital Advisor (C)
provides system recommendations, and Harvest (D) is a monitoring tool for ONTAP environments but
not focused on data classification and action automation.
Cyber Monday is quickly approaching as supply chain issues delay a large server shipment to the data
center. The VMware environment is storage-heavy and needs to rapidly grow using Azure services.
Which two technologies should the company use to resolve the issue? (Choose two.)
A, D
Explanation:
To rapidly expand a storage-heavy VMware environment using Azure services, the company should
utilize:
Azure VMware Solution (A): This service allows VMware workloads to be seamlessly migrated or
expanded into Azure, providing a familiar environment with the scalability and flexibility of the Azure
cloud. It's an ideal solution for extending on-premises VMware deployments to Azure.
Azure NetApp Files (D): Azure NetApp Files provides highly performant, scalable file storage in Azure.
It integrates well with Azure VMware Solution, offering a robust storage backend for VMware
workloads, especially when they are storage-heavy.
Amazon FSx for NetApp ONTAP (B) and Google Cloud VMware Engine (C) are not relevant in this
Azure-specific scenario.
A NetApp BlueXP observability customer wants to update security keys for one of their acquisition
units. Which method should the customer use to update or rotate their keys?
A
Explanation:
To update or rotate security keys for one of their acquisition units in NetApp BlueXP observability,
the customer should use the SecurityAdmin tool. This tool is designed to manage security-related
configurations, including key updates and rotations, ensuring secure management of encryption keys
and certificates.
User Management (B) handles user roles and access, Cloud Central (C) is for general cloud services
management, and the Workload Security menu (D) focuses on security monitoring and enforcement
rather than key management.
Which types of NetApp Encryption are supported with the cloud provider's key vault with NetApp
Cloud Volumes ONTAP?
D
Explanation:
NetApp Cloud Volumes ONTAP supports NetApp Volume Encryption (NVE) with the cloud provider's
key vault for encryption key management. NVE provides encryption at the volume level and
integrates with external key management systems, such as AWS KMS, Azure Key Vault, and Google
Cloud KMS, making it the appropriate encryption solution for cloud deployments.
NetApp Aggregate Encryption (NAE) (C) is used for on-premises environments. Transparent Data
Encryption (TDE) (A) is commonly used for database encryption, and Onboard Key Manager (OKM)
(B) is a different key management solution not tied to cloud key vaults.
A customer has 100TB of used capacity after efficiencies on an on-premises AFF volume. There is a
requirement to tier cold data to Amazon Simple Storage Service (Amazon S3) with BlueXP tiering.
There is also a requirement to back up the data with BlueXP backup and recovery to Amazon S3.
After enabling tiering, 80% of cold data is tiered, then the first full backup is completed.
What is the total ingress traffic into AWS?
A
Explanation:
In this scenario, the customer has 100TB of used capacity on an on-premises AFF volume, and 80% of
the data is cold and tiered to Amazon S3 using BlueXP tiering. After tiering, 80TB of cold data is tiered
to Amazon S3, leaving 20TB of hot data on the AFF system. When BlueXP backup and recovery
performs the first full backup, it backs up all the data (100TB). Since the backup is a full copy and
independent of the tiering process, the total ingress traffic into AWS is 80TB (tiered data) + 100TB
(full backup), resulting in 180TB of total ingress.
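The ingress arithmetic above can be sketched as a quick sanity check (a minimal sketch using only the figures stated in the scenario: 100TB used capacity, 80% of data cold and tiered):

```python
# Sanity check of the AWS ingress arithmetic from the scenario above.
# Figures come from the question: 100 TB used capacity, 80% cold data tiered.
used_capacity_tb = 100
cold_fraction = 0.80

tiered_tb = used_capacity_tb * cold_fraction  # data moved to S3 by BlueXP tiering
full_backup_tb = used_capacity_tb             # first full backup copies all 100 TB

total_ingress_tb = tiered_tb + full_backup_tb
print(f"Tiered: {tiered_tb:.0f} TB, Backup: {full_backup_tb} TB, "
      f"Total ingress: {total_ingress_tb:.0f} TB")
# Tiered: 80 TB, Backup: 100 TB, Total ingress: 180 TB
```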
A customer wants to prevent deletion of volumes and snapshots by a rogue administrator. They do not want the option of assigning a trusted storage administrator who can delete the snapshots.
Which two solutions should the customer implement? (Choose two.)
B, D
Explanation:
To prevent the deletion of volumes and snapshots by a rogue administrator without the option to
assign a trusted administrator, the customer should implement:
SnapLock Enterprise (B): SnapLock is a feature that provides WORM (Write Once, Read Many)
protection, ensuring that volumes or snapshots cannot be deleted or modified for a set retention
period, even by administrators.
Tamperproof NetApp Snapshot copies (D): Snapshots in ONTAP can be made tamperproof to protect
data from deletion or modification, securing them against rogue administrators.
Multi-admin verification (A) requires approval from multiple administrators, which the customer
does not want. Role-based access control (C) helps manage permissions but does not provide
protection against a rogue administrator with elevated permissions.
A customer wants an application-aware data management solution for Kubernetes clusters. The
customer wants to install this solution on-premises on their own hardware.
Which two solutions should the customer deploy? (Choose two.)
C, D
Explanation:
For an application-aware data management solution for Kubernetes clusters that can be deployed
on-premises on the customer's own hardware, the following two solutions should be deployed:
NetApp ONTAP AFF (C): ONTAP AFF systems provide enterprise-grade storage with Kubernetes
integration, allowing the customer to manage Kubernetes workloads with advanced data
management features like snapshots and replication.
NetApp Astra Control Center (D): Astra Control Center is designed for on-premises environments and
provides application-aware data management for Kubernetes clusters. It helps with backup, restore,
and migration for containerized applications on the customer’s infrastructure.
Azure NetApp Files (A) and Astra Control Service (B) are cloud-based solutions and are not designed
for on-premises deployments.
A customer has on-premises NetApp systems and wants information about data to migrate to Azure.
Which dashboard in NetApp BlueXP digital advisor should the customer use?
C
Explanation:
To get insights about which data to migrate from on-premises NetApp systems to Azure, the
customer should use the Cloud Recommendations dashboard in NetApp BlueXP Digital Advisor. This
dashboard analyzes the on-premises environment and provides recommendations on which
workloads or datasets are best suited for migration to the cloud, such as to Azure.
Other dashboards like Valuable Insights (A) and Health Check (B) provide general system health and
performance information, while Keystone Advisor (D) relates to NetApp’s subscription-based storage
offering.
Which network configuration is required for NetApp BlueXP to discover an on-premises NetApp
cluster?
A
Explanation:
For NetApp BlueXP to discover an on-premises NetApp cluster, the network must be configured to
allow outbound 443 access to the BlueXP service. Port 443 is used for secure HTTPS communication,
and BlueXP needs to establish an outbound connection from the on-premises NetApp cluster to the
cloud-based BlueXP service for discovery and management.
Inbound 443 access (B and C) is not required for discovery, and outbound 443 access to the
Connector IP address (D) is relevant only when interacting with the BlueXP Connector, not for cluster
discovery.
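The outbound-443 requirement above can be verified from the on-premises network before attempting discovery. The sketch below is a generic outbound HTTPS reachability check, not an official NetApp tool, and the hostname in the usage comment is illustrative rather than a confirmed BlueXP endpoint:

```python
import socket

def can_reach_https(host: str, timeout: float = 5.0) -> bool:
    """Return True if an outbound TCP connection to port 443 succeeds."""
    try:
        with socket.create_connection((host, 443), timeout=timeout):
            return True
    except OSError:
        return False

# Example usage (hostname is a placeholder, not a verified BlueXP endpoint):
# can_reach_https("cloudmanager.cloud.netapp.com")
```

A False result points to a firewall or proxy blocking outbound HTTPS, which would also block BlueXP discovery.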
A customer has an on-premises AFF cluster and needs to replicate a NAS volume to Azure NetApp
Files. Which replication technology should the customer use?
C
Explanation:
To replicate a NAS volume from an on-premises AFF (All-Flash FAS) cluster to Azure NetApp Files, the
customer should use NetApp BlueXP Replication. This replication technology facilitates data
synchronization and replication between ONTAP systems and Azure NetApp Files, making it ideal for
hybrid cloud data mobility.
NetApp BlueXP copy and sync (A) is for file migration, BlueXP tiering (B) is for storage optimization,
and Azure Site Recovery (D) is focused on VM disaster recovery, not NAS volume replication.
A hospital needs to continuously scan a variety of data sources to verify that they are meeting
regulatory compliance.
Which NetApp BlueXP cloud services solution should the hospital use?
C
Explanation:
For continuously scanning various data sources to ensure regulatory compliance, NetApp BlueXP
Classification is the appropriate solution. This service helps organizations identify and classify
sensitive data across their environments, ensuring that they meet compliance requirements such as
healthcare regulations (HIPAA, for example).
Operational resiliency (A) focuses on system reliability, Digital advisor (B) offers system performance
insights, and Ransomware protection (D) deals with security threats rather than compliance
scanning.
A customer has a cloud-first strategy and wants to protect data against ransomware. The customer
wants to use the NetApp Autonomous Ransomware Protection feature.
Which solution should the customer use?
A
Explanation:
To protect data against ransomware, NetApp Cloud Volumes ONTAP offers the NetApp Autonomous
Ransomware Protection feature. This feature uses machine learning and data analytics to detect and
respond to abnormal file activities, helping prevent ransomware attacks.
Azure NetApp Files (B), Amazon FSx for NetApp ONTAP (C), and NetApp Cloud Volumes Service (D)
provide robust data services, but Cloud Volumes ONTAP specifically includes the Autonomous
Ransomware Protection feature.
A company is moving out of data centers and plans to leverage Azure cloud. An architect needs to
move the existing SQL database to cloud.
Which native cloud solution can the architect use to enable independent scaling of storage from
compute?
A
Explanation:
For migrating an existing SQL database to the cloud while enabling independent scaling of storage
and compute, Azure NetApp Files is the best solution. Azure NetApp Files provides high-performance
file storage, and it allows users to independently scale storage capacity and compute resources
based on the needs of the workload, including databases like SQL.
Other options like NetApp Cloud Volumes Service (B) and NetApp Cloud Volumes ONTAP (C) offer
similar capabilities but are not native to Azure. Amazon FSx for NetApp ONTAP (D) is an AWS-specific
service and does not fit an Azure-based strategy.
A customer is using NetApp ONTAP software and wants to tier data from ONTAP clusters with all-SSD
aggregates or all-HDD aggregates to the Microsoft Azure cloud platform.
Which two best practices will enhance the customer's performance? (Choose two.)
A, B
Explanation:
To enhance the performance when tiering data from NetApp ONTAP clusters to the Microsoft Azure
cloud platform, the following best practices should be considered:
Establish a VNet service endpoint to Azure storage (A): This ensures secure and optimized access to
Azure Blob storage directly from the ONTAP cluster, minimizing latency.
Create an Azure ExpressRoute connection (B): ExpressRoute provides a dedicated, high-performance
connection between the on-premises ONTAP clusters and Azure Blob storage, reducing latency and
increasing throughput.
While choosing the IPspace (C) is important for network configuration, it does not directly enhance
cloud-tiering performance. An HTTPS connection over port 443 (D) secures data transfer but is not
specifically performance-enhancing.
When deploying NetApp Cloud Volumes ONTAP, which Amazon Elastic Container Service (Amazon
ECS) workload characteristic is used to size an application that uses large sequential I/O?
A
Explanation:
When deploying NetApp Cloud Volumes ONTAP for an Amazon Elastic Container Service (Amazon
ECS) workload that involves large sequential I/O, the sizing characteristic to consider is MBps
(megabytes per second). Large sequential I/O workloads, such as video streaming or backup
operations, typically rely on high throughput rather than random IOPS.
Capacity (B) and IOPS (C) are important for other workload types, while CPU (D) is less relevant for
storage throughput requirements in this context.
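The relationship between IOPS and throughput explained above can be made concrete. Assuming a fixed I/O size, throughput is simply IOPS multiplied by that size, which is why large sequential workloads (big I/O sizes) are sized on MBps while small random workloads are sized on IOPS (the figures below are illustrative, not official sizing numbers):

```python
# Illustrative relationship between IOPS and throughput for sizing.
# Large sequential I/O uses big I/O sizes, so MBps becomes the binding metric.
def throughput_mbps(iops: int, io_size_kb: int) -> float:
    """Approximate throughput in MBps for a given IOPS rate and I/O size."""
    return iops * io_size_kb / 1024  # KB/s -> MB/s

# Same IOPS, very different throughput demands:
print(throughput_mbps(2000, 4))     # small random I/O (4 KB)    -> 7.8125
print(throughput_mbps(2000, 1024))  # large sequential I/O (1 MB) -> 2000.0
```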