Microsoft AI-100 Exam Questions

Questions for the AI-100 were updated on: Jul 20, 2024

Questions 1-15 of 219

Question 1 Topic 1, Case Study 1

You need to design the Butler chatbot solution to meet the technical requirements.
What is the best channel and pricing tier to use? More than one answer choice may achieve the goal. Select the BEST
answer.

  • A. Standard channels that use the S1 pricing tier
  • B. Standard channels that use the Free pricing tier
  • C. Premium channels that use the Free pricing tier
  • D. Premium channels that use the S1 pricing tier
Answer: D

Explanation:
References:
https://azure.microsoft.com/en-in/pricing/details/bot-service/

Question 2 Topic 1, Case Study 1

You need to meet the testing requirements for the data scientists.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. Deploy an Azure Kubernetes Service (AKS) cluster to the East US 2 region
  • B. Get the docker image from mcr.microsoft.com/azure-cognitive-services/sentiment:latest
  • C. Deploy an Azure Container Service cluster to the West Europe region
  • D. Export the production version of the Language Understanding (LUIS) app
  • E. Deploy a Kubernetes cluster to Azure Stack
  • F. Get the docker image from mcr.microsoft.com/azure-cognitive-services/luis:latest
  • G. Export the staging version of the Language Understanding (LUIS) app
Answer: E F G

Explanation:
Scenario: Data scientists must test Butler by using ASDK.
Note: Contoso wants to provide a new version of the Bookings app that will provide a highly available, reliable service for
booking travel packages by interacting with a chatbot named Butler.
E: The ASDK (Azure Stack Development Kit) is meant to provide an environment in which you can evaluate Azure Stack and
develop modern applications using APIs and tooling consistent with Azure in a non-production environment.
Microsoft Azure Stack integrated systems range in size from 4-16 nodes, and are jointly supported by a hardware partner
and Microsoft.
F: The Language Understanding (LUIS) container loads your trained or published Language Understanding model, also
known as a LUIS app, into a docker container and provides access to the query predictions from the container's API
endpoints.
Use the docker pull command to download a container image from the mcr.microsoft.com/azure-cognitive-services/luis
repository:
docker pull mcr.microsoft.com/azure-cognitive-services/luis:latest
G: You can test using the endpoint with a maximum of two versions of your app. With your main or live version of your app
set as the production endpoint, add a second version to the staging endpoint.
References:
https://docs.microsoft.com/en-us/azure-stack/asdk/asdk-what-is
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-container-howto
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-test
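
As a rough sketch of how the data scientists might stand up the LUIS container for testing, the commands below follow the standard Cognitive Services container pattern; the host paths, endpoint, and key are placeholders, and the memory/CPU limits are only examples:

# Pull the LUIS container image
docker pull mcr.microsoft.com/azure-cognitive-services/luis:latest

# Run the container, mounting the folder that holds the exported LUIS app package (.gz).
# Eula, Billing, and ApiKey are the standard Cognitive Services container settings;
# the endpoint and key values below are placeholders for your own LUIS resource.
docker run --rm -it -p 5000:5000 --memory 4g --cpus 2 \
  --mount type=bind,src=/path/to/luis/input,target=/input \
  --mount type=bind,src=/path/to/luis/output,target=/output \
  mcr.microsoft.com/azure-cognitive-services/luis:latest \
  Eula=accept \
  Billing=https://<your-luis-resource>.cognitiveservices.azure.com/ \
  ApiKey=<your-luis-key>

Once the container is running, prediction queries can be sent to its local endpoint on port 5000 from the ASDK environment.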

Question 3 Topic 2, Case Study 2

Which two services should be implemented so that Butler can find available rooms according to the technical requirements? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. QnA Maker
  • B. Bing Entity Search
  • C. Language Understanding (LUIS)
  • D. Azure Search
  • E. Content Moderator
Answer: C D

Explanation:
References: https://azure.microsoft.com/en-in/services/cognitive-services/language-understanding-intelligent-service/

Question 4 Topic 2, Case Study 2

DRAG DROP
You need to integrate the new Bookings app and the Butler chatbot.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the
answer area and arrange them in the correct order.
Select and Place:

Answer:


Explanation:
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-channel-connect-webchat?view=azure-bot-service-4.0

Question 5 Topic 2, Case Study 2

You need to meet the greeting requirements for Butler.
Which type of card should you use?

  • A. AdaptiveCard
  • B. SigninCard
  • C. CardCarousel
  • D. HeroCard
Answer: D

Explanation:
Scenario: Butler must greet users by name when they first connect.
HeroCard defines a card with a large image, title, text, and action buttons.
Incorrect Answers:
B: SigninCard defines a card that lets a user sign in to a service.
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-send-welcome-message

Question 6 Topic 3, Mixed Questions

HOTSPOT
You are designing an application to parse images of business forms and upload the data to a database. The upload process
will occur once a week.
You need to recommend which services to use for the application. The solution must minimize infrastructure costs.
Which services should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Answer:


Explanation:
Box 1: Azure Cognitive Services
Azure Cognitive Services include image-processing algorithms to smartly identify, caption, index, and moderate your pictures
and videos.
Not: Azure Linguistic Analytics API, which provides advanced natural language processing over raw text.
Box 2: Azure Data Factory
Azure Data Factory (ADF) is a service designed to allow developers to integrate disparate data sources. It is a platform somewhat like SSIS in the cloud, used to manage the data you have both on-premises and in the cloud.
It provides access to on-premises data in SQL Server and to cloud data in Azure Storage (Blob and Table) and Azure SQL Database.
Reference: https://azure.microsoft.com/en-us/services/cognitive-services/
https://www.jamesserra.com/archive/2014/11/what-is-azure-data-factory/
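
To illustrate the Cognitive Services piece of this design, a weekly job could send each form image to the Computer Vision OCR operation with a simple REST call; this is only a sketch, and the endpoint, subscription key, and image URL below are placeholders:

# Extract printed text from a business form image with the Computer Vision OCR operation.
# The endpoint, subscription key, and image URL are placeholders.
curl -X POST "https://<your-cv-resource>.cognitiveservices.azure.com/vision/v3.2/ocr?language=unk&detectOrientation=true" \
  -H "Ocp-Apim-Subscription-Key: <your-key>" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com/business-form.jpg"}'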

Question 7 Topic 3, Mixed Questions

HOTSPOT
You plan to deploy an Azure Data Factory pipeline that will perform the following:
- Move data from on-premises to the cloud.
- Consume Azure Cognitive Services APIs.


You need to recommend which technologies the pipeline should use. The solution must minimize custom code.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Answer:


Explanation:
Box 1: Self-hosted Integration Runtime
A self-hosted IR is capable of running copy activities between cloud data stores and a data store in a private network.
Not Azure-SSIS Integration Runtime, as you would need to write custom code.
Box 2: Azure Logic Apps
Azure Logic Apps helps you orchestrate and integrate different services by providing 100+ ready-to-use connectors, ranging
from on-premises SQL Server or SAP to Microsoft Cognitive Services.
Incorrect:
Not Azure API Management: Use Azure API Management as a turnkey solution for publishing APIs to external and internal
customers.
References:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime
https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-examples-and-scenarios

Question 8 Topic 3, Mixed Questions

HOTSPOT
You need to build an interactive website that will accept uploaded images, and then ask a series of predefined questions
based on each image.
Which services should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Answer:


Explanation:
Box 1: Azure Bot Service
Box 2: Computer Vision
The Computer Vision Analyze Image feature returns information about visual content found in an image. Use tagging,
domain-specific models, and descriptions in four languages to identify content and label it with confidence. Use Object
Detection to get location of thousands of objects within an image. Apply the adult/racy settings to help you detect potential
adult content. Identify image types and color schemes in pictures.
References: https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/
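
As a sketch of the Computer Vision call the website back end might make for each uploaded image (the endpoint, subscription key, and image URL are placeholders):

# Analyze an uploaded image for a description, tags, objects, and adult content.
# The endpoint, subscription key, and image URL are placeholders.
curl -X POST "https://<your-cv-resource>.cognitiveservices.azure.com/vision/v3.2/analyze?visualFeatures=Description,Tags,Objects,Adult" \
  -H "Ocp-Apim-Subscription-Key: <your-key>" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com/uploaded-image.jpg"}'

The returned tags and description can then drive the predefined questions that the bot asks about the image.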

Question 9 Topic 3, Mixed Questions

You are designing an AI solution that will analyze millions of pictures.
You need to recommend a solution for storing the pictures. The solution must minimize costs.
Which storage solution should you recommend?

  • A. an Azure Data Lake store
  • B. Azure File Storage
  • C. Azure Blob storage
  • D. Azure Table storage
Answer: C

Explanation:
Azure Data Lake Store would be slightly more expensive, although the two are in a close price range. Blob storage has more pricing options, depending on factors such as how frequently you need to access your data (cool vs. hot storage).
Reference: http://blog.pragmaticworks.com/azure-data-lake-vs-azure-blob-storage-in-data-warehousing
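
For example, a local folder of pictures can be bulk-uploaded to Blob storage with the Azure CLI; the storage account and container names below are placeholders, and the access tier (hot vs. cool) can then be chosen to manage cost:

# Create a container and bulk-upload a local folder of pictures to Blob storage.
# The storage account and container names are placeholders.
az storage container create --account-name <storageaccount> --name pictures
az storage blob upload-batch --account-name <storageaccount> --destination pictures --source ./pictures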

Question 10 Topic 3, Mixed Questions

You are configuring data persistence for a Microsoft Bot Framework application. The application requires a structured
NoSQL cloud data store.
You need to identify a storage solution for the application. The solution must minimize costs.
What should you identify?

  • A. Azure Blob storage
  • B. Azure Cosmos DB
  • C. Azure HDInsight
  • D. Azure Table storage
Answer: D

Explanation:
Table storage is a NoSQL key-value store for rapid development using massive semi-structured datasets. You can develop applications on Cosmos DB using popular NoSQL APIs.
The two services target different scenarios and have different pricing models.
Azure Table storage is aimed at high capacity in a single region (with an optional read-only secondary region but no failover), indexing by PartitionKey/RowKey, and storage-optimized pricing. The Azure Cosmos DB Table API aims for high throughput with single-digit millisecond latency, global distribution with multiple failover regions, SLA-backed predictable performance with automatic indexing of every attribute/property, and a pricing model focused on throughput.
References: https://db-engines.com/en/system/Microsoft+Azure+Cosmos+DB%3BMicrosoft+Azure+Table+Storage
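
A minimal sketch of using Table storage from the Azure CLI is shown below; the account, table, and entity values are placeholders, and a Bot Framework application would more typically go through a storage SDK or the botbuilder-azure package:

# Create a table and insert a simple entity keyed by PartitionKey/RowKey.
# The account, table, and property values are placeholders.
az storage table create --account-name <storageaccount> --name botstate
az storage entity insert --account-name <storageaccount> --table-name botstate \
  --entity PartitionKey=user123 RowKey=conversation456 LastTopic=booking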

Question 11 Topic 3, Mixed Questions

You have an Azure Machine Learning model that is deployed to a web service.
You plan to publish the web service by using the name ml.contoso.com.
You need to recommend a solution to ensure that access to the web service is encrypted.
Which three actions should you recommend? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. Generate a shared access signature (SAS)
  • B. Obtain an SSL certificate
  • C. Add a deployment slot
  • D. Update the web service
  • E. Update DNS
  • F. Create an Azure Key Vault
Answer: B D E

Explanation:
The process of securing a new web service or an existing one is as follows:
1. Get a domain name.
2. Get a digital certificate.
3. Deploy or update the web service with the SSL setting enabled.
4. Update your DNS to point to the web service.
Note: To deploy (or re-deploy) the service with SSL enabled, set the ssl_enabled parameter to True, wherever applicable.
Set the ssl_certificate parameter to the value of the certificate file and the ssl_key to the value of the key file.
References: https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-secure-web-service
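
For the DNS step (answer E), assuming the contoso.com zone is hosted in Azure DNS, the CNAME record could be created as follows; the resource group and target FQDN are placeholders:

# Point ml.contoso.com at the scoring endpoint of the deployed web service.
# The resource group and target FQDN are placeholders.
az network dns record-set cname set-record \
  --resource-group <dns-rg> \
  --zone-name contoso.com \
  --record-set-name ml \
  --cname <scoring-endpoint-fqdn>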

Question 12 Topic 3, Mixed Questions

Your company recently deployed several hardware devices that contain sensors.
The sensors generate new data on an hourly basis. The data generated is stored on-premises and retained for several
years.
During the past two months, the sensors generated 300 GB of data.
You plan to move the data to Azure and then perform advanced analytics on the data.
You need to recommend an Azure storage solution for the data.
Which storage solution should you recommend?

  • A. Azure Queue storage
  • B. Azure Cosmos DB
  • C. Azure Blob storage
  • D. Azure SQL Database
Answer: C

Explanation:
References: https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage

Question 13 Topic 3, Mixed Questions

You plan to design an application that will use data from Azure Data Lake and perform sentiment analysis by using Azure
Machine Learning algorithms.
The developers of the application use a mix of Windows- and Linux-based environments. The developers contribute to
shared GitHub repositories.
You need all the developers to use the same tool to develop the application.
What is the best tool to use? More than one answer choice may achieve the goal.

  • A. Microsoft Visual Studio Code
  • B. Azure Notebooks
  • C. Azure Machine Learning Studio
  • D. Microsoft Visual Studio
Answer: C

Explanation:
References: https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/machine-learning/studio/algorithm-choice.md

Question 14 Topic 3, Mixed Questions

You have several AI applications that use an Azure Kubernetes Service (AKS) cluster. The cluster supports a maximum of
32 nodes.
You discover that occasionally and unpredictably, the application requires more than 32 nodes.
You need to recommend a solution to handle the unpredictable application load.
Which scaling method should you recommend?

  • A. horizontal pod autoscaler
  • B. cluster autoscaler
  • C. manual scaling
  • D. Azure Container Instances
Answer: B

Explanation:
B: To keep up with application demands in Azure Kubernetes Service (AKS), you may need to adjust the number of nodes
that run your workloads. The cluster autoscaler component can watch for pods in your cluster that can't be scheduled
because of resource constraints. When issues are detected, the number of nodes is increased to meet the application
demand. Nodes are also regularly checked for a lack of running pods, with the number of nodes then decreased as needed.
This ability to automatically scale up or down the number of nodes in your AKS cluster lets you run an efficient, cost-effective
cluster.
Reference:
https://docs.microsoft.com/en-us/azure/aks/cluster-autoscaler
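
Enabling the cluster autoscaler on an existing AKS cluster can be done with a single CLI call such as the sketch below; the resource group, cluster name, and node-count limits are placeholders:

# Enable the cluster autoscaler and allow the cluster to grow beyond 32 nodes when needed.
# The resource group, cluster name, and count limits are placeholders.
az aks update \
  --resource-group <rg> \
  --name <aks-cluster> \
  --enable-cluster-autoscaler \
  --min-count 3 \
  --max-count 40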

Question 15 Topic 3, Mixed Questions

You deploy an infrastructure for a big data workload.
You need to run Azure HDInsight and Microsoft Machine Learning Server. You plan to set the RevoScaleR compute
contexts to run rx function calls in parallel.
What are three compute contexts that you can use for Machine Learning Server? Each correct answer presents a complete
solution.
NOTE: Each correct selection is worth one point.

  • A. SQL
  • B. Spark
  • C. local parallel
  • D. HBase
  • E. local sequential
Answer: A B C

Explanation:
Remote computing is available for specific data sources on selected platforms. The supported combinations include:
- RxInSqlServer, sqlserver: Remote compute context. The target server is a single database node (SQL Server 2016 R Services or SQL Server 2017 Machine Learning Services). Computation is parallel, but not distributed.
- RxSpark, spark: Remote compute context. The target is a Spark cluster on Hadoop.
- RxLocalParallel, localpar: This compute context is often used to enable controlled, distributed computations relying on instructions you provide rather than a built-in scheduler on Hadoop. You can use this compute context for manual distributed computing.
References: https://docs.microsoft.com/en-us/machine-learning-server/r/concept-what-is-compute-context
