AI-102 Solution Planning and Operations

Use for resource design, security, deployment, containers, monitoring, pricing, and other cross-cutting operational choices.

Exams: AI-102
Questions: 14
Comments: 185

1. AI-102 Topic 1 Question 7

Sequence: 5
Discussion ID: 74867
Source URL: https://www.examtopics.com/discussions/microsoft/view/74867-exam-ai-102-topic-1-question-7-discussion/
Posted By: SamedKia
Posted At: April 29, 2022, 9:24 a.m.

Question

DRAG DROP -
You plan to use containerized versions of the Anomaly Detector API on local devices for testing and in on-premises datacenters.
You need to ensure that the containerized deployments meet the following requirements:
✑ Prevent billing and API information from being stored in the command-line histories of the devices that run the container.
✑ Control access to the container images by using Azure role-based access control (Azure RBAC).
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.
Select and Place:
[image]

Suggested Answer

[image]

Comments (21)

Comment 1

ID: 597103 User: dinhhungitsoft Badges: Highly Voted Relative Date: 3 years, 10 months ago Absolute Date: Thu 05 May 2022 05:22 Selected Answer: - Upvotes: 68

I think it should be:
1. Create a custom Dockerfile
2. Pull the container image (in the Dockerfile)
3. Build the image
4. Push to ACR

Comment 1.1

ID: 1247736 User: Aouatef Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Sun 14 Jul 2024 12:39 Selected Answer: - Upvotes: 2

1 - The custom Dockerfile prevents the billing information from being stored in command-line histories.
2 - Pull and build are necessary to take into account the requirements defined in the Dockerfile.
3 - Push to ACR, where we can define RBAC access rules for the new image.
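
The sequence discussed above can be sketched roughly as follows. This is only a sketch: the registry name, resource endpoint, and key are placeholders, and the exact base-image tag should be checked against Microsoft's container documentation.

```shell
# The custom Dockerfile bakes the billing endpoint and key into the image,
# so neither appears in the command-line history of the devices that run it.
cat > Dockerfile <<'EOF'
FROM mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest
ENV Eula=accept
ENV Billing=https://<resource-name>.cognitiveservices.azure.com/
ENV ApiKey=<api-key>
EOF

docker build -t myregistry.azurecr.io/anomaly-detector:custom .  # pulls the base image and builds
az acr login --name myregistry
docker push myregistry.azurecr.io/anomaly-detector:custom        # ACR access is governed by Azure RBAC
```

Devices can then run the image from ACR with a plain `docker run`, with no billing or API arguments on the command line.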

Comment 2

ID: 906654 User: mmaguero Badges: Highly Voted Relative Date: 1 year, 5 months ago Absolute Date: Mon 23 Sep 2024 06:18 Selected Answer: - Upvotes: 31

ChatGPT: To meet the requirements, perform the following actions in sequence:

1. Create a custom Dockerfile: This allows you to define the specific configuration and dependencies for your containerized deployment.

2. Build the image: Use the Dockerfile to build the container image with the necessary components and configurations.

3. Push the image to an Azure container registry: Store the container image in an Azure container registry to control access and ensure secure storage.

4. Distribute a docker run script: Provide a docker run script that can be used to run the containerized deployment on local devices or in on-premises datacenters. This script should include any necessary environment variables or configuration settings.

Note: The actions of pulling the Anomaly Detector container image and pushing the image to Docker Hub are not mentioned in the requirements. Therefore, they are not necessary for meeting the given requirements.

Comment 3

ID: 1717965 User: LovelyGroovey Badges: Most Recent Relative Date: 1 week, 4 days ago Absolute Date: Sat 28 Feb 2026 18:27 Selected Answer: - Upvotes: 1

Microsoft Copilot said "Pull the Anomaly Detector container image" is the starting point. See: https://learn.microsoft.com/en-us/azure/ai-services/anomaly-detector/anomaly-detector-container-howto Then you can create a custom Dockerfile.

Comment 4

ID: 1710277 User: sdk1 Badges: - Relative Date: 1 month, 1 week ago Absolute Date: Thu 29 Jan 2026 19:21 Selected Answer: - Upvotes: 1

Pull the Anomaly Detector container image: first obtain the official base image from the Microsoft Container Registry.
Create a custom Dockerfile: To prevent secrets from appearing in CLI history
Build the image: processes the Dockerfile and creates a "hardened" version of the container locally.
Push the image to an Azure container registry: Azure Container Registry (ACR) supports Azure RBAC.

Comment 5

ID: 1324788 User: MASANASA Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Wed 11 Dec 2024 01:59 Selected Answer: - Upvotes: 2

Should it be these steps?

Create a custom Dockerfile
Pull the Anomaly Detector Container Image
Build the Image
Push the image to an Azure Container registry

Comment 6

ID: 1305691 User: angelrishi Badges: - Relative Date: 1 year, 4 months ago Absolute Date: Fri 01 Nov 2024 09:03 Selected Answer: - Upvotes: 7

Incorrect. The correct answer is:
1. Create a custom Dockerfile (it includes instructions to pull the latest Anomaly Detector image)
2. Build the image
3. Push to ACR
4. Distribute a docker run script

Comment 7

ID: 1247079 User: SAMBIT Badges: - Relative Date: 1 year, 5 months ago Absolute Date: Mon 23 Sep 2024 06:18 Selected Answer: - Upvotes: 3

Pull the Anomaly Detector container image (b):
Start by retrieving the official Anomaly Detector container image from a registry. You can use the docker pull command to download it.

Create a custom Dockerfile (a):
Next, create a custom Dockerfile that defines the instructions for building your specific image. Customize it to include your requirements, such as preventing sensitive information from being stored in command-line histories.

Build the image (e):
Use the Dockerfile to build your custom image. This step compiles the necessary components and configurations.

Push the image to an Azure container registry (d):
Finally, store your custom image in an Azure container registry. This ensures controlled access and secure storage.

Comment 8

ID: 1261088 User: RahulShahane Badges: - Relative Date: 1 year, 5 months ago Absolute Date: Mon 23 Sep 2024 06:18 Selected Answer: - Upvotes: 1

1. Create a custom Dockerfile
2. Pull the container image (in the Dockerfile)
3. Build the image
4. Push to ACR
This is the highly voted answer. Is it correct? Please help.

Comment 9

ID: 1271055 User: testmaillo020 Badges: - Relative Date: 1 year, 6 months ago Absolute Date: Fri 23 Aug 2024 06:15 Selected Answer: - Upvotes: 2

This is the right one:

1. Pull the Anomaly Detector container image.
2. Create a custom Dockerfile.
3. Build the image.
4. Push the image to an Azure container registry.

Comment 10

ID: 1235714 User: rookiee1111 Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Sun 23 Jun 2024 09:21 Selected Answer: - Upvotes: 3

1. Pull the Anomaly Detector image
2. Create the Dockerfile
3. Build the image
4. Publish to ACR
ACR then becomes the source from which all devices pull and run the image.

Comment 11

ID: 1217632 User: nanaw770 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Fri 24 May 2024 17:02 Selected Answer: - Upvotes: 2

1.Create a custom Dockerfile
2.Build the image
3.Push the image to an Azure container registry
4.Distribute a docker run script

From the Takedajuku perspective, if you study for four days and spend two days reviewing, you will have a better chance of passing the exam.

Comment 12

ID: 1209246 User: TJ001 Badges: - Relative Date: 1 year, 10 months ago Absolute Date: Fri 10 May 2024 08:59 Selected Answer: - Upvotes: 2

It is important to note: "More than one order of answer choices is correct."

Comment 12.1

ID: 1353897 User: miai74uu Badges: - Relative Date: 1 year, 1 month ago Absolute Date: Sun 09 Feb 2025 13:51 Selected Answer: - Upvotes: 1

You rock; this shows that we have to read the question carefully.

Comment 13

ID: 1209245 User: TJ001 Badges: - Relative Date: 1 year, 10 months ago Absolute Date: Fri 10 May 2024 08:56 Selected Answer: - Upvotes: 2

1. Pull the image
2. Create dockerfile
3. Build
4. Run
5. Push to ACR if required

Comment 13.1

ID: 1221907 User: TJ001 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Fri 31 May 2024 04:59 Selected Answer: - Upvotes: 2

While this order is technically fine, after reading the question again I think it is important to push the image to ACR to manage RBAC. So skip the docker pull and instead pull directly from MCR in the Dockerfile. My four choices are below:
1. Create the Dockerfile
2. Build
3. Push to ACR if required
4. Run

Comment 14

ID: 1205012 User: tolliik Badges: - Relative Date: 1 year, 10 months ago Absolute Date: Wed 01 May 2024 12:41 Selected Answer: - Upvotes: 6

1. Create.
2. Build.
3. Push
4. Run
Example instructions using the hello-world image:
https://learn.microsoft.com/en-us/azure/container-registry/container-registry-quickstart-task-cli

Comment 14.1

ID: 1220140 User: vovap0vovap Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Tue 28 May 2024 12:27 Selected Answer: - Upvotes: 1

Yeah, but instructions say "datacenters" - plural. That means basically that step "Distribute a docker run script" is there for a reason.

Comment 15

ID: 1182255 User: varinder82 Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Mon 25 Mar 2024 07:35 Selected Answer: - Upvotes: 4

Final Answer:
1.Create a custom Dockerfile
2.Build the image
3.Push the image to an Azure container registry
4.Distribute a docker run script:

Comment 16

ID: 1172688 User: Murtuza Badges: - Relative Date: 1 year, 12 months ago Absolute Date: Wed 13 Mar 2024 16:42 Selected Answer: - Upvotes: 4

I think it should be:
1. Pull the container image (in the Dockerfile)
2. Create a custom Dockerfile
3. Build the image
4. Push to ACR

Comment 17

ID: 1147009 User: evangelist Badges: - Relative Date: 2 years, 1 month ago Absolute Date: Sun 11 Feb 2024 07:19 Selected Answer: - Upvotes: 11

Check the official Azure video on YouTube: https://www.youtube.com/watch?v=XLQLNazid4I
The steps should be:
1. Pull the container image (in the Dockerfile)
2. Create a custom Dockerfile
3. Build the image
4. Push to ACR

2. AI-102 Topic 1 Question 6

Sequence: 24
Discussion ID: 74870
Source URL: https://www.examtopics.com/discussions/microsoft/view/74870-exam-ai-102-topic-1-question-6-discussion/
Posted By: PHD_CHENG
Posted At: April 29, 2022, 10:10 a.m.

Question

You are developing a new sales system that will process the video and text from a public-facing website.
You plan to monitor the sales system to ensure that it provides equitable results regardless of the user's location or background.
Which two responsible AI principles provide guidance to meet the monitoring requirements? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. transparency
  • B. fairness
  • C. inclusiveness
  • D. reliability and safety
  • E. privacy and security

Suggested Answer

BC

Comments (17)

Comment 1

ID: 627552 User: Eltooth Badges: Highly Voted Relative Date: 3 years, 8 months ago Absolute Date: Tue 05 Jul 2022 19:04 Selected Answer: BC Upvotes: 27

B and C are correct answers.

Comment 2

ID: 1085160 User: natgurulearning Badges: Highly Voted Relative Date: 2 years, 3 months ago Absolute Date: Fri 01 Dec 2023 12:02 Selected Answer: - Upvotes: 7

A and B, 100%; the key word is "monitor".
Transparency means you should be clear about how the system works and how it makes decisions, so you can monitor the results and ensure that they're not biased against any particular group of users. And fairness means that the system should be designed to treat all users equally, regardless of their location or background. By monitoring the system to ensure that it's transparent and fair, you can help to prevent any unintended biases or discrimination in the results. It's all about being ethical and responsible in the development and use of AI systems.

Comment 3

ID: 1706386 User: mateuszwdowiak Badges: Most Recent Relative Date: 1 month, 4 weeks ago Absolute Date: Tue 13 Jan 2026 13:40 Selected Answer: BC Upvotes: 1

Nowadays it has to be B and C.

Comment 4

ID: 1699154 User: Madhusmita Badges: - Relative Date: 2 months, 4 weeks ago Absolute Date: Sat 13 Dec 2025 01:41 Selected Answer: AB Upvotes: 1

These two principles work together to enable effective monitoring:

Fairness defines what you're monitoring for (equitable treatment)
Transparency enables how you monitor (by making the system's behavior visible and explainable)

Comment 5

ID: 1629906 User: PrasadMP Badges: - Relative Date: 3 months, 1 week ago Absolute Date: Tue 02 Dec 2025 16:36 Selected Answer: BC Upvotes: 1

B and C are Correct Answer

Comment 6

ID: 1627801 User: nkcodescribbler Badges: - Relative Date: 3 months, 2 weeks ago Absolute Date: Sun 23 Nov 2025 02:57 Selected Answer: BC Upvotes: 1

B - No discrimination based on background, location, gender, ethnicity, etc.
C - The system is usable and effective for all people, including those from different regions, cultures, or backgrounds.

Comment 7

ID: 1608719 User: spacemess Badges: - Relative Date: 6 months ago Absolute Date: Sat 13 Sep 2025 04:53 Selected Answer: AB Upvotes: 1

The question is about monitoring a system to ensure equitable results across locations and backgrounds, which directly relates to Fairness (avoiding bias) and Transparency (understanding and explaining system behavior).

Why not following:
C. Inclusiveness – Focuses on accessibility and ensuring people of all abilities can use the system, not on monitoring for fairness.
D. Reliability and Safety – Ensures the system performs consistently and safely, but doesn’t directly address equity or bias.
E. Privacy and Security – Protects user data and system integrity, but is unrelated to monitoring for equitable outcomes.

Comment 8

ID: 1608381 User: microbricks Badges: - Relative Date: 6 months ago Absolute Date: Fri 12 Sep 2025 02:49 Selected Answer: AB Upvotes: 1

Inclusiveness helps build the system; fairness and transparency help monitor it.

Comment 9

ID: 1572638 User: man5484 Badges: - Relative Date: 9 months, 2 weeks ago Absolute Date: Tue 27 May 2025 12:03 Selected Answer: BC Upvotes: 1

Fairness: avoiding bias, ensuring equal treatment.
Inclusiveness: ensuring access, representation, and equal participation.

Comment 10

ID: 1220062 User: tolliik Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Tue 28 May 2024 09:41 Selected Answer: - Upvotes: 1

Why not A&B?
Transparency: the model interpretability and counterfactual what-if components of the Responsible AI dashboard in Azure Machine Learning enable data scientists and developers to generate human-understandable descriptions of a model's predictions.

Fairness: no comment; it is clear.

Comment 11

ID: 1217631 User: nanaw770 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Fri 24 May 2024 17:00 Selected Answer: BC Upvotes: 1

It must be B and C.

Comment 12

ID: 1211306 User: JamesKJoker Badges: - Relative Date: 1 year, 10 months ago Absolute Date: Tue 14 May 2024 10:44 Selected Answer: BC Upvotes: 1

BC is obviously correct

Comment 13

ID: 1195249 User: CDL_Learner Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Sun 14 Apr 2024 05:09 Selected Answer: BC Upvotes: 2

B. Fairness: This principle is about ensuring that the AI system provides equitable results regardless of a user’s location or background. It involves minimizing bias in the AI system’s outcomes and providing equal opportunities for all users.
C. Inclusiveness: This principle is about designing AI systems that are accessible and usable by the widest possible range of people, regardless of their location or background. It involves considering the diverse characteristics of potential users during the design and deployment of the AI system.

Comment 14

ID: 1194708 User: sivapolam90 Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Sat 13 Apr 2024 08:49 Selected Answer: BC Upvotes: 2

B and C

Comment 15

ID: 1192638 User: michaelmorar Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Wed 10 Apr 2024 05:12 Selected Answer: BC Upvotes: 1

The requirements do not mention reliability and safety, only that users are treated equally regardless of their characteristics or background.

Comment 16

ID: 1181739 User: f2c587e Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Sun 24 Mar 2024 16:00 Selected Answer: BC Upvotes: 1

The question clearly speaks about fairness and inclusiveness by stating that the results must be independent of location.

Comment 17

ID: 1147004 User: evangelist Badges: - Relative Date: 2 years, 1 month ago Absolute Date: Sun 11 Feb 2024 07:04 Selected Answer: BC Upvotes: 1

While transparency (A) is important for making the operations of AI systems understandable to users, and reliability and safety (D), along with privacy and security (E), are crucial for dependable operation, user safety, and data protection, fairness and inclusiveness are specifically targeted at ensuring equitable results and addressing the monitoring requirements stated in this scenario.

3. AI-102 Topic 1 Question 70

Sequence: 27
Discussion ID: 150682
Source URL: https://www.examtopics.com/discussions/microsoft/view/150682-exam-ai-102-topic-1-question-70-discussion/
Posted By: a8da4af
Posted At: Nov. 3, 2024, 10:03 p.m.

Question

You have an Azure DevOps pipeline named Pipeline1 that is used to deploy an app. Pipeline1 includes a step that will create an Azure AI services account.

You need to add a step to Pipeline1 that will identify the created Azure AI services account. The solution must minimize development effort.

Which Azure Command-Line Interface (CLI) command should you run?

  • A. az resource link
  • B. az cognitiveservices account network-rule
  • C. az cognitiveservices account show
  • D. az account list

Suggested Answer

C

Comments (3)

Comment 1

ID: 1704368 User: IDLVN Badges: - Relative Date: 2 months ago Absolute Date: Tue 06 Jan 2026 04:39 Selected Answer: C Upvotes: 1

C indeed

Comment 2

ID: 1313308 User: 9cc71b6 Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Sun 17 Nov 2024 01:10 Selected Answer: C Upvotes: 1

Pretty simple

Comment 3

ID: 1306644 User: a8da4af Badges: - Relative Date: 1 year, 4 months ago Absolute Date: Sun 03 Nov 2024 22:03 Selected Answer: C Upvotes: 3

Correct, the answer is C; here is the ChatGPT explanation:

The correct answer is C. az cognitiveservices account show.

Explanation:

The az cognitiveservices account show command retrieves details about a specified Azure Cognitive Services account, including its properties and settings. By using this command in the Azure DevOps pipeline, you can identify and confirm the details of the Azure AI services account that was created in a previous step.
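
For illustration, a pipeline step might run something like the following; the account and resource-group names here are hypothetical.

```shell
# Retrieve the created account; --query / --output tsv trims the result
# to a single value that a later pipeline step can consume.
az cognitiveservices account show \
  --name my-ai-account \
  --resource-group my-rg \
  --query "properties.endpoint" \
  --output tsv
```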

4. AI-102 Topic 1 Question 66

Sequence: 38
Discussion ID: 135094
Source URL: https://www.examtopics.com/discussions/microsoft/view/135094-exam-ai-102-topic-1-question-66-discussion/
Posted By: Mehe323
Posted At: March 3, 2024, 7:44 a.m.

Question

You are developing a system that will monitor temperature data from a data stream. The system must generate an alert in response to atypical values. The solution must minimize development effort.

What should you include in the solution?

  • A. Multivariate Anomaly Detection
  • B. Azure Stream Analytics
  • C. metric alerts in Azure Monitor
  • D. Univariate Anomaly Detection

Suggested Answer

D

Comments (19)

Comment 1

ID: 1191667 User: TT924 Badges: Highly Voted Relative Date: 1 year, 11 months ago Absolute Date: Mon 08 Apr 2024 17:56 Selected Answer: B Upvotes: 9

This is the similar example, I would vote for B.
Use case of Stream Analytics
Query: Alert to trigger a business workflow
Let's make our query more detailed. For every type of sensor, we want to monitor average temperature per 30-second window and display results only if the average temperature is above 100 degrees.

https://learn.microsoft.com/en-us/azure/stream-analytics/stream-analytics-get-started-with-azure-stream-analytics-to-process-data-from-iot-devices

Comment 1.1

ID: 1577225 User: azuretrainer1 Badges: - Relative Date: 9 months ago Absolute Date: Fri 13 Jun 2025 17:12 Selected Answer: - Upvotes: 1

Great for real-time stream processing, but it requires writing queries and more setup.
Can complement detection, but not a minimal-effort solution for anomaly detection alone.

Comment 2

ID: 1221253 User: PeteColag Badges: Highly Voted Relative Date: 1 year, 9 months ago Absolute Date: Wed 29 May 2024 22:50 Selected Answer: C Upvotes: 9

The key requirements here are that you are working from a data stream and must limit development effort.
Anomaly detection does not work with data streams natively, so considerable development work would be required to integrate this functionality. As a result, the correct answer is C, not D.

Comment 3

ID: 1699737 User: Mo2010 Badges: Most Recent Relative Date: 2 months, 3 weeks ago Absolute Date: Tue 16 Dec 2025 03:26 Selected Answer: B Upvotes: 1

You’re monitoring temperature data from a data stream, which points to a stream-processing solution.

Azure Stream Analytics (ASA) can ingest streaming data (IoT Hub, Event Hub, etc.) and has built-in anomaly detection functions (spike and dip detection).

This allows you to detect atypical temperature values and trigger alerts with minimal custom code, which matches the requirement to minimize development effort.

Comment 4

ID: 1628385 User: harshad883 Badges: - Relative Date: 3 months, 2 weeks ago Absolute Date: Tue 25 Nov 2025 20:01 Selected Answer: D Upvotes: 3

You are monitoring temperature data from a single sensor stream (one variable).
Univariate Anomaly Detection is designed for detecting anomalies in a single time-series signal.
It uses Azure Cognitive Services Anomaly Detector API, which minimizes development effort because:

No need to build or train your own ML model.
Simple REST API or SDK integration.
You can set thresholds and generate alerts easily.

Comment 5

ID: 1591459 User: akode Badges: - Relative Date: 7 months, 2 weeks ago Absolute Date: Tue 29 Jul 2025 12:57 Selected Answer: B Upvotes: 1

B. A and D are deprecated. Starting on the 20th of September, 2023, you can't create new Anomaly Detector resources, and the Anomaly Detector service is being retired on the 1st of October, 2026.

Comment 6

ID: 1564408 User: marcellov Badges: - Relative Date: 10 months, 2 weeks ago Absolute Date: Mon 28 Apr 2025 13:44 Selected Answer: B Upvotes: 2

B. Azure Stream Analytics
While Univariate Anomaly Detection (D) is the concept of detecting anomalies in a single metric like temperature and is simpler than multivariate detection, the best way to minimize development effort in Azure is to use Azure Stream Analytics (B), which provides built-in univariate anomaly detection functions as part of its managed, scalable service. Stream Analytics lets you process real-time data streams, detect anomalies with simple SQL-like queries, and integrate easily with alerting systems, all without building custom detection logic or managing infrastructure. So although univariate detection is the right approach conceptually, Azure Stream Analytics is the practical, low-effort way to implement it end to end.
Source: AI

Comment 7

ID: 1366736 User: JituVin Badges: - Relative Date: 1 year ago Absolute Date: Sun 09 Mar 2025 03:13 Selected Answer: B Upvotes: 1

To monitor temperature data from a data stream and generate alerts for atypical values with minimal development effort, you should include:

B. Azure Stream Analytics

Azure Stream Analytics allows you to process and analyze real-time data streams with minimal development effort. You can define query logic to detect anomalies and trigger alerts based on the results. This makes it an ideal solution for generating alerts in response to atypical temperature values from your data stream.

Comment 8

ID: 1365431 User: Mattt Badges: - Relative Date: 1 year ago Absolute Date: Wed 05 Mar 2025 15:33 Selected Answer: D Upvotes: 3

Because you’re dealing with a single time-series (temperature) and you want minimal development overhead for spotting anomalies, the best answer is:

D. Univariate Anomaly Detection.

Comment 9

ID: 1363279 User: sonstevold Badges: - Relative Date: 1 year ago Absolute Date: Fri 28 Feb 2025 23:42 Selected Answer: C Upvotes: 1

Confirmed by multiple AI-services

Comment 10

ID: 1358361 User: ceris22962 Badges: - Relative Date: 1 year ago Absolute Date: Tue 18 Feb 2025 17:28 Selected Answer: C Upvotes: 1

Both ChatGPT and Copilot said.

Comment 11

ID: 1332781 User: pabsinaz Badges: - Relative Date: 1 year, 2 months ago Absolute Date: Sat 28 Dec 2024 05:52 Selected Answer: B Upvotes: 1

Azure Stream Analytics offers the most straightforward and efficient solution for real-time temperature monitoring with minimal development effort

Comment 12

ID: 1332378 User: hamidai102 Badges: - Relative Date: 1 year, 2 months ago Absolute Date: Fri 27 Dec 2024 12:31 Selected Answer: D Upvotes: 4

The correct answer is:

D. Univariate Anomaly Detection

Explanation:
Since the system is monitoring temperature data from a data stream, and the goal is to generate alerts based on atypical values (likely unusual temperature readings), Univariate Anomaly Detection is the most suitable solution.

Univariate Anomaly Detection focuses on identifying anomalies in a single time-series dataset. In this case, temperature data can be considered a univariate time-series, where the system detects unusual or atypical temperature values based on historical trends or thresholds. This minimizes development effort by using a simple approach to detect outliers in one variable (temperature) without requiring complex multi-variable or machine learning models.
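
To make the univariate idea concrete, here is a minimal stand-in sketch in plain Python (not the Azure Anomaly Detector API): a rolling z-score flags atypical values in a single temperature series. The window size and threshold are arbitrary assumptions.

```python
from statistics import mean, stdev

def flag_anomalies(series, window=10, z_threshold=3.0):
    """Flag points whose z-score vs. the preceding window exceeds the threshold."""
    anomalies = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(series[i] - mu) / sigma > z_threshold:
            anomalies.append(i)  # index of the atypical reading
    return anomalies

# Steady readings around 21 degrees with one spike at index 15
temps = [21.0, 21.2, 20.9, 21.1, 21.0, 21.3, 20.8, 21.1, 21.0, 21.2,
         21.1, 20.9, 21.0, 21.2, 21.1, 35.0, 21.0, 21.1]
print(flag_anomalies(temps))  # the spike at index 15 is flagged
```

A managed service replaces this hand-rolled logic with pre-trained models, which is the "minimize development effort" argument either way.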

Comment 13

ID: 1324061 User: nmlan Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Mon 09 Dec 2024 14:28 Selected Answer: A Upvotes: 1

At first read, I felt like "temperature data" refers to the broader category of any variables included to monitor temperature (such as humidity, elevation, etc). This is why I noted Multivariate Anomaly Detection.

Azure Stream Analytics would require more configuration to build an anomaly detector so this doesn't sound like the best solution for "minimal development effort"

Comment 14

ID: 1322078 User: 3fbc31b Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Wed 04 Dec 2024 21:56 Selected Answer: D Upvotes: 3

D.
Explanation:
The solution involves monitoring temperature data, which is typically a single-variable or univariate data stream. The Univariate Anomaly Detection service is ideal because:

  • Focuses on single variables: it is optimized to detect anomalies in data streams consisting of a single variable, such as temperature readings.
  • Minimal development effort: Azure's Anomaly Detector API includes Univariate Anomaly Detection and provides pre-trained models that require minimal customization or configuration. You only need to feed the data stream into the API and analyze the results.
  • Efficient for time-series data: it detects sudden spikes, dips, or trends in time series data, which aligns perfectly with monitoring temperature anomalies.

Comment 14.1

ID: 1322080 User: 3fbc31b Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Wed 04 Dec 2024 21:56 Selected Answer: - Upvotes: 2

Why not the other options?
  • A. Multivariate Anomaly Detection: used for scenarios with multiple interdependent variables (e.g., temperature, pressure, and humidity). Since only temperature data is monitored here, this is unnecessary.
  • B. Azure Stream Analytics: a powerful tool for real-time stream processing, but it requires more setup and custom query development. It does not provide prebuilt anomaly detection.
  • C. Metric alerts in Azure Monitor: for monitoring Azure resource metrics; not suitable for processing external or custom time-series data streams, like temperature readings.

Comment 15

ID: 1319989 User: friendlyvlad Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Sat 30 Nov 2024 02:23 Selected Answer: D Upvotes: 3

if your primary goal is to detect anomalies and generate alerts with minimal development effort, Anomaly Detection might be the better choice. However, if you need to perform more complex real-time data processing and analytics, Stream Analytics could be more suitable.

Comment 16

ID: 1314033 User: AL_everyday Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Mon 18 Nov 2024 15:34 Selected Answer: B Upvotes: 2

Copilot:
For a solution that monitors temperature data from a data stream and generates alerts in response to atypical values while minimizing development effort, B. Azure Stream Analytics is the most suitable option.

Here's why:

Azure Stream Analytics provides a fully managed service for real-time data stream processing.

It can easily integrate with other Azure services, making it straightforward to set up and scale.

Built-in anomaly detection functions help identify outliers in data without the need for extensive custom development.

Comment 17

ID: 1307387 User: Alan_CA Badges: - Relative Date: 1 year, 4 months ago Absolute Date: Tue 05 Nov 2024 14:53 Selected Answer: B Upvotes: 2

I asked Copilot :
Overall, Azure Stream Analytics offers a more comprehensive and integrated solution for your requirements, making it easier to set up, maintain, and scale.

And :
Starting on the 20th of September, 2023 you won’t be able to create new Anomaly Detector resources

5. AI-102 Topic 1 Question 51

Sequence: 79
Discussion ID: 108815
Source URL: https://www.examtopics.com/discussions/microsoft/view/108815-exam-ai-102-topic-1-question-51-discussion/
Posted By: Rob77
Posted At: May 9, 2023, 5:27 p.m.

Question

You have an app named App1 that uses an Azure Cognitive Services model to identify anomalies in a time series data stream.

You need to run App1 in a location that has limited connectivity. The solution must minimize costs.

What should you use to host the model?

  • A. Azure Kubernetes Service (AKS)
  • B. Azure Container Instances
  • C. a Kubernetes cluster hosted in an Azure Stack Hub integrated system
  • D. the Docker Engine

Suggested Answer

D

Comments (17)

Comment 1

ID: 893250 User: Rob77 Badges: Highly Voted Relative Date: 2 years, 10 months ago Absolute Date: Tue 09 May 2023 17:27 Selected Answer: - Upvotes: 22

Docker - https://learn.microsoft.com/en-us/azure/cognitive-services/cognitive-services-container-support

Comment 2

ID: 1215083 User: vovap0vovap Badges: Highly Voted Relative Date: 1 year, 9 months ago Absolute Date: Tue 21 May 2024 18:14 Selected Answer: - Upvotes: 5

I might be stupid, but "data stream" and "limited connectivity" rule out for me anything cloud-based, which leads to the Docker Engine.

Comment 3

ID: 1578122 User: StelSen Badges: Most Recent Relative Date: 8 months, 4 weeks ago Absolute Date: Mon 16 Jun 2025 23:25 Selected Answer: D Upvotes: 1

D. Docker Engine (+ IoT Edge)
You containerize the Cognitive Service model and run it locally using Docker. By integrating with Azure IoT Edge, you can operate in intermittent/offline mode, with occasional connectivity just for billing sync. It meets the limited connectivity requirement and is cost-effective
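
As a sketch of that pattern (the endpoint and key below are placeholders; note that the container still needs periodic connectivity to report billing usage unless a disconnected-container commitment is in place):

```shell
# Run the containerized Anomaly Detector model on a local Docker Engine host
docker run --rm -p 5005:5000 --memory 4g --cpus 1 \
  mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest \
  Eula=accept \
  Billing=https://<resource-name>.cognitiveservices.azure.com/ \
  ApiKey=<api-key>
```

App1 would then call the local endpoint (e.g., http://localhost:5005) instead of the cloud API.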

Comment 4

ID: 1337423 User: MRGPAL Badges: - Relative Date: 1 year, 2 months ago Absolute Date: Tue 07 Jan 2025 04:19 Selected Answer: D Upvotes: 2

Docker Engine → ideal for lightweight, offline use → low cost
Azure Kubernetes Service → requires cloud connectivity → high cost

Comment 5

ID: 1297855 User: Sujeeth Badges: - Relative Date: 1 year, 4 months ago Absolute Date: Tue 15 Oct 2024 00:51 Selected Answer: - Upvotes: 2

The correct choice is D. the Docker Engine.
When running in a location with limited connectivity and aiming to minimize costs, Docker Engine is a lightweight and cost-effective option to host the model locally. You can containerize the Azure Cognitive Services model and run it using Docker without needing a cloud connection. This approach allows the app to function offline while minimizing infrastructure costs.
A. Azure Kubernetes Service (AKS) is more complex and typically requires cloud connectivity, making it less suitable for limited connectivity scenarios.
B. Azure Container Instances also requires cloud access and may not work well in offline environments.
C. A Kubernetes cluster in an Azure Stack Hub provides an on-premises cloud environment, but it is more costly and complex compared to running the model in Docker.
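
Pulling the thread together, the Docker Engine answer can be sketched as a single docker run. This is a sketch only: the image path is the public MCR Anomaly Detector image, while the endpoint and key values are placeholders you would take from your own Azure resource.

```shell
# Sketch: hosting the Anomaly Detector container with plain Docker Engine.
# BILLING_ENDPOINT and API_KEY are placeholders, not real credentials.
IMAGE="mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest"
BILLING_ENDPOINT="https://<your-resource>.cognitiveservices.azure.com/"
API_KEY="<your-resource-key>"

# No orchestrator (AKS/ACI) is needed; the container only needs intermittent
# connectivity so that Billing/ApiKey can report usage for metering.
RUN_CMD="docker run --rm -p 5000:5000 --memory 4g --cpus 1 ${IMAGE} Eula=accept Billing=${BILLING_ENDPOINT} ApiKey=${API_KEY}"

echo "${RUN_CMD}"
```

Run the echoed command on the on-premises Docker host; the service then listens locally on port 5000.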

Comment 6

ID: 1275390 User: famco Badges: - Relative Date: 1 year, 6 months ago Absolute Date: Sat 31 Aug 2024 07:49 Selected Answer: - Upvotes: 1

Is this an AI question or an AZ-204 question? They talk about app hosting and then ask about the model. Ambiguous. Just trying to make people fall into a trap (if you have to do this certification, you are already in that trap). The answer should not be Azure hosting, because that will need connectivity. It has to be container services. I guess the wise guy who wrote this excellent question wants Docker Engine to be the answer. That's the best guess in this guessing-how-intelligent-the-Microsoft-guys-are game.

Comment 7

ID: 1235155 User: HaraTadahisa Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Sat 22 Jun 2024 08:01 Selected Answer: D Upvotes: 3

D is the answer. Docker must be used.

Comment 8

ID: 1231886 User: etellez Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Mon 17 Jun 2024 12:43 Selected Answer: - Upvotes: 1

Copilot Says
C. a Kubernetes cluster hosted in an Azure Stack Hub integrated system

Explanation:

Azure Stack Hub is an extension of Azure that provides a way to run apps in an on-premises environment and deliver Azure services in your datacenter. It's designed for scenarios where you have limited or unreliable connectivity to the public cloud. By hosting the model on a Kubernetes cluster in an Azure Stack Hub integrated system, you can run your app in the location with limited connectivity. This solution also helps to minimize costs as you only pay for the Azure services that you use.

Comment 9

ID: 1231503 User: gary_cooper Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Sun 16 Jun 2024 20:44 Selected Answer: D Upvotes: 2

Docker Engine

Comment 10

ID: 1230898 User: gary_cooper Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Sat 15 Jun 2024 12:55 Selected Answer: D Upvotes: 2

Docker seems right

Comment 11

ID: 1226730 User: monniq Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Sat 08 Jun 2024 14:31 Selected Answer: D Upvotes: 3

D seems most suitable

Comment 12

ID: 1225889 User: InfoMerp Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Fri 07 Jun 2024 05:21 Selected Answer: D Upvotes: 2

Says ChatGPT

Comment 13

ID: 1225621 User: UnknownUser Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Thu 06 Jun 2024 18:10 Selected Answer: D Upvotes: 3

To host your Azure Cognitive Services model in a location with limited connectivity while minimizing costs, you should use:

D. the Docker Engine

Here's why:

Docker Engine allows you to run containerized applications locally without requiring a full orchestration platform. This can be particularly useful in environments with limited connectivity where you need to run applications in a more lightweight and cost-effective manner.

Unlike Azure Kubernetes Service (AKS) or Azure Container Instances, which typically require internet connectivity to Azure for management and operation, Docker Engine can operate entirely offline once the container images are downloaded.

Using a Kubernetes cluster hosted in an Azure Stack Hub integrated system (Option C) could work but would be significantly more complex and costly compared to simply using Docker Engine.

Minimizing costs is a key consideration, and Docker Engine provides a straightforward, low-overhead solution compared to the other options.

Comment 14

ID: 1224082 User: emiliocb4 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Tue 04 Jun 2024 13:04 Selected Answer: D Upvotes: 2

Docker Engine for the limited-connectivity requirement. AKS and ACI must have internet connectivity, and Azure Stack Hub is highly costly.

Comment 15

ID: 1219893 User: PeteColag Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Tue 28 May 2024 01:34 Selected Answer: D Upvotes: 3

Azure Kubernetes Service (AKS) and Azure Container Instances both require a consistent and reliable internet connection to Azure, which makes them unsuitable for a location with limited connectivity. A Kubernetes cluster hosted in an Azure Stack Hub integrated system could work, but it might be overkill for your needs and potentially more expensive.
Excluding these leaves only D as the correct solution.

Comment 16

ID: 1217608 User: nanaw770 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Fri 24 May 2024 16:46 Selected Answer: - Upvotes: 1

I would like to know the basis on which Azure Container Instances is the correct answer.

Comment 17

ID: 1207811 User: emiliocb4 Badges: - Relative Date: 1 year, 10 months ago Absolute Date: Tue 07 May 2024 11:55 Selected Answer: D Upvotes: 3

Docker is the correct response

6. AI-102 Topic 3 Question 61

Sequence
87
Discussion ID
135625
Source URL
https://www.examtopics.com/discussions/microsoft/view/135625-exam-ai-102-topic-3-question-61-discussion/
Posted By
GHill1982
Posted At
March 10, 2024, 7:29 a.m.

Question

You are developing an app that will use the Decision and Language APIs.

You need to provision resources for the app. The solution must ensure that each service is accessed by using a single endpoint and credential.

Which type of resource should you create?

  • A. Language
  • B. Speech
  • C. Azure Cognitive Services
  • D. Content Moderator

Suggested Answer

C

Comments 7 comments

Comment 1

ID: 1220354 User: nanaw770 Badges: Highly Voted Relative Date: 1 year, 9 months ago Absolute Date: Tue 28 May 2024 16:35 Selected Answer: C Upvotes: 5

C is the right answer, but this service has since been renamed to "Azure AI services".

Comment 2

ID: 1575461 User: prasioso Badges: Most Recent Relative Date: 9 months, 1 week ago Absolute Date: Sat 07 Jun 2025 11:29 Selected Answer: C Upvotes: 1

Deprecated question but might appear on exam in new form. Now bundled in Azure AI services multi-service account. Contains:
- Decision -> Content Moderator
- Language -> Language Understanding, Language, Translator
- Speech -> S2T, T2S, Speech Translation
- Vision -> Computer vision, Custom vision, Face
- Document Intelligence -> Document intelligence (studio)

The answer to the question would now be Azure AI services (multi-service).
The endpoint still uses the cognitive services domain.

Comment 3

ID: 1322625 User: chrillelundmark Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Fri 06 Dec 2024 07:33 Selected Answer: C Upvotes: 3

This question is deprecated. Azure Cognitive Services is now Azure AI services. I can't find any info about the Decision API, but the Language API resides within the new AI services.

So for the options:
A) Part of Azure AI Services
B) Part of Azure AI Services
C) Renamed to Azure AI Services
D) Part of Azure AI Services, but renamed to Content Safety

Comment 4

ID: 1235187 User: HaraTadahisa Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Sat 22 Jun 2024 08:21 Selected Answer: C Upvotes: 2

I say this answer is C.

Comment 5

ID: 1229214 User: reigenchimpo Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Wed 12 Jun 2024 16:12 Selected Answer: C Upvotes: 1

C is the answer.

Comment 6

ID: 1186766 User: Murtuza Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Sun 31 Mar 2024 14:15 Selected Answer: C Upvotes: 1

The correct answer is C. Azure Cognitive Services.

When you create an Azure Cognitive Services resource, you get access to a suite of services and APIs, including the Decision and Language APIs, under a single endpoint and credential. This simplifies the management of these services and enhances security by reducing the number of credentials you need to manage. Other options like Language, Speech, and Content Moderator are individual services within Azure Cognitive Services. They do not provide a single endpoint and credential for accessing multiple services. Therefore, they do not meet the requirement specified in the question

Comment 7

ID: 1170078 User: GHill1982 Badges: - Relative Date: 2 years ago Absolute Date: Sun 10 Mar 2024 07:29 Selected Answer: C Upvotes: 3

A Cognitive Services multi-service resource allows you to access multiple Azure AI services with a single key and endpoint.
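
As a sketch of what "single endpoint and credential" means in practice: the resource name, key, and exact API paths below are illustrative placeholders, but both calls share the same endpoint and the same Ocp-Apim-Subscription-Key header.

```shell
# One multi-service (Azure AI services) resource: one endpoint, one key.
ENDPOINT="https://<your-multiservice-resource>.cognitiveservices.azure.com"
KEY="<your-single-key>"

# Two different services authenticated with the same credential
# (request paths are illustrative):
LANGUAGE_CALL="curl -X POST ${ENDPOINT}/language/:analyze-text?api-version=2023-04-01 -H 'Ocp-Apim-Subscription-Key: ${KEY}'"
MODERATION_CALL="curl -X POST ${ENDPOINT}/contentmoderator/moderate/v1.0/ProcessText/Screen -H 'Ocp-Apim-Subscription-Key: ${KEY}'"

echo "${LANGUAGE_CALL}"
echo "${MODERATION_CALL}"
```

With single-service resources (options A, B, D), each service would instead have its own endpoint and its own key pair.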

7. AI-102 Topic 1 Question 10

Sequence
91
Discussion ID
54752
Source URL
https://www.examtopics.com/discussions/microsoft/view/54752-exam-ai-102-topic-1-question-10-discussion/
Posted By
LKLK10
Posted At
June 6, 2021, 11:58 p.m.

Question

You successfully run the following HTTP request.
POST https://management.azure.com/subscriptions/18c51a87-3a69-47a8-aedc-a54745f708a1/resourceGroups/RG1/providers/
Microsoft.CognitiveServices/accounts/contoso1/regenerateKey?api-version=2017-04-18
Body{"keyName": "Key2"}
What is the result of the request?

  • A. A key for Azure Cognitive Services was generated in Azure Key Vault.
  • B. A new query key was generated.
  • C. The primary subscription key and the secondary subscription key were rotated.
  • D. The secondary subscription key was reset.

Suggested Answer

D

Comments 20 comments

Comment 1

ID: 410057 User: LPreethi Badges: Highly Voted Relative Date: 4 years, 7 months ago Absolute Date: Tue 20 Jul 2021 08:31 Selected Answer: - Upvotes: 15

Answer is correct - a sample response will be:
{
  "key1": "KEY1",
  "key2": "KEY2"
}
This shows Key1 is already there, and this JSON request will generate Key2.

Comment 1.1

ID: 575277 User: ghoppa Badges: - Relative Date: 3 years, 11 months ago Absolute Date: Fri 25 Mar 2022 23:02 Selected Answer: - Upvotes: 5

Key2 has always been there. They are automatically created by default when you create a resource. You cannot delete them or add new ones; you can only regenerate/refresh them (poorly worded here as RESET). Still, the correct answer is D in my opinion.

Comment 2

ID: 393210 User: azurelearner666 Badges: Highly Voted Relative Date: 4 years, 8 months ago Absolute Date: Mon 28 Jun 2021 21:26 Selected Answer: - Upvotes: 5

What a crap question! It confuses the account keys being regenerated with "subscription keys"... those are account keys, not "query keys".
The only matching option is "A. a key for Cognitive Services was generated"

But...
... it was RE-generated.

The other options are even worse: "a query key was generated" (what this call actually does is regenerate an account key), and C and D talk about "subscription keys", though there is nothing by that name.

Clearly the goal is to confuse; this question is as bad as it can be.

Comment 2.1

ID: 415237 User: YipingRuan Badges: - Relative Date: 4 years, 7 months ago Absolute Date: Tue 27 Jul 2021 08:59 Selected Answer: - Upvotes: 1

Yes, it regenerates the specified "account key" for the specified Cognitive Services account.

Comment 3

ID: 1572641 User: man5484 Badges: Most Recent Relative Date: 9 months, 2 weeks ago Absolute Date: Tue 27 May 2025 12:21 Selected Answer: D Upvotes: 1

The regenerateKey API regenerates one of the two access keys (Key1 or Key2) used to authenticate against the Cognitive Services resource. These are subscription keys, not Azure Key Vault keys or query keys.

Comment 4

ID: 1331116 User: AmitSonal Badges: - Relative Date: 1 year, 2 months ago Absolute Date: Tue 24 Dec 2024 13:53 Selected Answer: D Upvotes: 1

You can use the REST API to regenerate keys:
POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.CognitiveServices/accounts/{accountName}/regenerateKey?api-version=2024-10-01
subscriptionId: Your Azure subscription ID.
resourceGroupName: The name of your resource group.
accountName: The name of your Cognitive Services account.
{
"keyName": "Key1" // or "Key2"
}

Comment 5

ID: 1295604 User: Christian_garcia_martin Badges: - Relative Date: 1 year, 5 months ago Absolute Date: Thu 10 Oct 2024 15:22 Selected Answer: - Upvotes: 1

Makes sense: since the endpoint is regenerateKey, the action returned is that the key "was reset".

Comment 6

ID: 1264645 User: LanGo Badges: - Relative Date: 1 year, 7 months ago Absolute Date: Mon 12 Aug 2024 14:29 Selected Answer: - Upvotes: 1

The answer is D. Solution verified in Azure

Comment 7

ID: 1262721 User: Sylviaaaaaaa Badges: - Relative Date: 1 year, 7 months ago Absolute Date: Fri 09 Aug 2024 03:43 Selected Answer: - Upvotes: 1

D is correct. https://learn.microsoft.com/en-us/rest/api/aiservices/accountmanagement/accounts/regenerate-key?view=rest-aiservices-accountmanagement-2023-05-01&tabs=HTTP

Comment 8

ID: 1233512 User: nabou Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Thu 20 Jun 2024 09:56 Selected Answer: - Upvotes: 2

The solution is D. The regenerateKey operation is used to regenerate either the primary or the secondary key for an Azure Cognitive Services account. You specify which key to regenerate by providing the key name in the request body. Possible values for the key name are 'Key1' for the primary key and 'Key2' for the secondary key.

Comment 9

ID: 1232686 User: IMFaisalShah Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Wed 19 Jun 2024 04:45 Selected Answer: - Upvotes: 1

D is the correct option.
The API shown regenerates either the primary or the secondary key, depending on whether Key1 or Key2 is specified in the body. This is clearly documented below:

https://learn.microsoft.com/en-us/rest/api/aiservices/accountmanagement/accounts/regenerate-key?view=rest-aiservices-accountmanagement-2023-05-01&tabs=HTTP

Comment 10

ID: 1219581 User: takaimomoGcup Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Mon 27 May 2024 15:23 Selected Answer: D Upvotes: 1

D is right answer.

Comment 11

ID: 1213811 User: reiwanotora Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Sun 19 May 2024 15:35 Selected Answer: D Upvotes: 1

D is right.

Comment 12

ID: 1213576 User: PeteColag Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Sun 19 May 2024 04:29 Selected Answer: D Upvotes: 1

D. The secondary subscription key was reset.

This is indicated by "keyName": "Key2" in the request body, specifying that the secondary subscription key (Key2) should be regenerated.

Comment 13

ID: 1209121 User: TJ001 Badges: - Relative Date: 1 year, 10 months ago Absolute Date: Fri 10 May 2024 02:38 Selected Answer: - Upvotes: 2

Indeed, this API call will regenerate the secondary key for the Cognitive Services account; Key1 remains the same. It could have been clearer if it had said 'account' instead of 'subscription'.

Comment 14

ID: 1204146 User: kiina Badges: - Relative Date: 1 year, 10 months ago Absolute Date: Mon 29 Apr 2024 19:33 Selected Answer: - Upvotes: 2

The HTTP POST request you mentioned is specifically for regenerating the key of an Azure Cognitive Services account, not the Azure subscription itself. Here’s a breakdown to clarify:

Azure Cognitive Services Key: This is the key you’re regenerating with the request. Cognitive Services keys are used to authenticate API requests for the various cognitive services provided by Azure, like computer vision, language understanding, and more.
Azure Subscription Key: This key is different; it’s associated with your Azure subscription and is used to manage billing and service usage across all services within Azure.
So, in summary, the request regenerates the secondary key (Key2) for the specified Cognitive Services account (contoso1), allowing continued access to Cognitive Services APIs with the new key. It does not affect the Azure subscription key. This is important for security practices, ensuring that if a key is compromised, you can regenerate it without interrupting services or access.

Comment 15

ID: 1173160 User: V3rgil Badges: - Relative Date: 1 year, 12 months ago Absolute Date: Thu 14 Mar 2024 07:05 Selected Answer: D Upvotes: 1

The MS Learn video suggests it's D.

Comment 16

ID: 1168587 User: cp2323 Badges: - Relative Date: 2 years ago Absolute Date: Fri 08 Mar 2024 07:16 Selected Answer: - Upvotes: 1

I think the answer is B.
On answer D: because it says "subscription", I don't think it's resetting a subscription key, as this has more to do with the resource key.

Comment 17

ID: 1147016 User: evangelist Badges: - Relative Date: 2 years, 1 month ago Absolute Date: Sun 11 Feb 2024 07:37 Selected Answer: D Upvotes: 3

D. The secondary subscription key was reset.

The request is to the Azure Management API to regenerate a key for an Azure Cognitive Services account (contoso1). The body of the request specifies {"keyName": "Key2"}, which indicates that the operation is targeted at the secondary subscription key (commonly referred to as Key2 in Azure Cognitive Services management). The regenerateKey action causes the specified key to be reset, generating a new key value for it while invalidating the old one.

Comment 17.1

ID: 1177853 User: Ody Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Wed 20 Mar 2024 01:57 Selected Answer: - Upvotes: 1

I think your explanation is the most accurate. With respect to the Azure Management API, the keys are found under the subscription.

This question would have been easier for me if the answer had referred to it as an "account" key.

I think the answer is D.
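
For completeness, the same reset can be sketched with the Azure CLI instead of a raw management REST call. The resource group and account names are the ones from the question, and the command is echoed rather than executed, since running it needs a real subscription.

```shell
# Sketch: Azure CLI equivalent of the regenerateKey REST call.
# RG1 and contoso1 are the resource group and account from the question.
REGEN_CMD="az cognitiveservices account keys regenerate --resource-group RG1 --name contoso1 --key-name Key2"

echo "${REGEN_CMD}"
# Only Key2 is reset; Key1 remains valid, which is why the two keys can be
# rotated one at a time without downtime.
```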

8. AI-102 Topic 1 Question 56

Sequence
148
Discussion ID
112154
Source URL
https://www.examtopics.com/discussions/microsoft/view/112154-exam-ai-102-topic-1-question-56-discussion/
Posted By
973b658
Posted At
June 14, 2023, 10:58 a.m.

Question

You have an Azure subscription that contains an Anomaly Detector resource.

You deploy a Docker host server named Server1 to the on-premises network.

You need to host an instance of the Anomaly Detector service on Server1.

Which parameter should you include in the docker run command?

  • A. Fluentd
  • B. Billing
  • C. Http Proxy
  • D. Mounts

Suggested Answer

B

Comments 13 comments

Comment 1

ID: 923807 User: Tin_Tin Badges: Highly Voted Relative Date: 2 years, 9 months ago Absolute Date: Thu 15 Jun 2023 09:05 Selected Answer: B Upvotes: 25

The answer is correct. Important:

The Eula, Billing, and ApiKey options must be specified to run the container; otherwise, the container won't start. For more information, see Billing. The ApiKey value is the Key from the Keys and Endpoints page in the LUIS portal and is also available on the Azure Cognitive Services resource keys page.
https://learn.microsoft.com/en-us/azure/cognitive-services/luis/luis-container-configuration#example-docker-run-commands

Comment 2

ID: 1197927 User: trato Badges: Highly Voted Relative Date: 1 year, 10 months ago Absolute Date: Thu 18 Apr 2024 14:03 Selected Answer: B Upvotes: 6

docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest \
Eula=accept \
Billing={ENDPOINT_URI} \
ApiKey={API_KEY}

https://learn.microsoft.com/en-us/azure/ai-services/anomaly-detector/anomaly-detector-container-howto#run-the-container-with-docker-run

Comment 3

ID: 1329448 User: Maddy2281 Badges: Most Recent Relative Date: 1 year, 2 months ago Absolute Date: Fri 20 Dec 2024 14:09 Selected Answer: C Upvotes: 1

B is more relevant to cost management.

Comment 4

ID: 1319978 User: friendlyvlad Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Sat 30 Nov 2024 01:16 Selected Answer: D Upvotes: 1

The docker run command has multiple --mount parameters. There is no parameter for Billing. The answer should be D.

Comment 5

ID: 1297856 User: Sujeeth Badges: - Relative Date: 1 year, 4 months ago Absolute Date: Tue 15 Oct 2024 01:04 Selected Answer: - Upvotes: 1

The correct choice is B. Billing.
When deploying an instance of the Anomaly Detector service on a Docker host like Server1, you need to specify the Billing parameter to link the on-premises containerized service to your Azure subscription for usage tracking and billing purposes. This ensures that even though the service is running locally, its usage is associated with your Azure subscription.
A. Fluentd is a log collector and isn't relevant for hosting Anomaly Detector.
C. Http Proxy would be used for network routing, but it's not directly related to the Anomaly Detector service setup.
D. Mounts refers to file system mounts in Docker and is not necessary for this type of service deployment.

Comment 6

ID: 1230904 User: gary_cooper Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Sat 15 Jun 2024 13:08 Selected Answer: B Upvotes: 1

Billing={ENDPOINT_URI}

Comment 7

ID: 1194117 User: michaelmorar Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Fri 12 Apr 2024 05:29 Selected Answer: B Upvotes: 2

Always about the billing with Microsoft! :)

Comment 8

ID: 1149690 User: evangelist Badges: - Relative Date: 2 years ago Absolute Date: Wed 14 Feb 2024 01:25 Selected Answer: - Upvotes: 1

no billing no use

Comment 9

ID: 929209 User: Hisayuki Badges: - Relative Date: 2 years, 8 months ago Absolute Date: Wed 21 Jun 2023 10:28 Selected Answer: B Upvotes: 3

Example:
$ docker run --rm -it -p 5000:5000 --memory 4g --cpus 2 \
    --mount type=bind,src=c:\demo\container,target=/input \
    --mount type=bind,src=C:\demo\container,target=/output \
    mcr.microsoft.com/azure-cognitive-services/luis \
    Eula=accept Billing=https://westus.api.cognitive.microsoft.com/luis/v2.0 ApiKey={___YOUR_API_KEY___}

Comment 10

ID: 925782 User: user361836261 Badges: - Relative Date: 2 years, 8 months ago Absolute Date: Sat 17 Jun 2023 07:32 Selected Answer: - Upvotes: 2

Answer B is correct
https://learn.microsoft.com/en-us/azure/cognitive-services/luis/luis-container-configuration#configuration-settings

Comment 11

ID: 924580 User: endeesa Badges: - Relative Date: 2 years, 8 months ago Absolute Date: Thu 15 Jun 2023 22:38 Selected Answer: D Upvotes: 2

Billing is not a docker run parameter, answer is D

Comment 12

ID: 924280 User: nitz14 Badges: - Relative Date: 2 years, 8 months ago Absolute Date: Thu 15 Jun 2023 16:29 Selected Answer: D Upvotes: 1

To host an instance of the Anomaly Detector service on Server1 using Docker, you should include the following parameter in the docker run command:

D. Mounts

Mounts allow you to attach a directory or a file from the Docker host server (in this case, Server1) to the container. By including the appropriate mount configuration, you can provide the necessary files or directories required to run the Anomaly Detector service.

Comment 13

ID: 922910 User: 973b658 Badges: - Relative Date: 2 years, 9 months ago Absolute Date: Wed 14 Jun 2023 10:58 Selected Answer: D Upvotes: 1

It is D.

9. AI-102 Topic 1 Question 53

Sequence
156
Discussion ID
112132
Source URL
https://www.examtopics.com/discussions/microsoft/view/112132-exam-ai-102-topic-1-question-53-discussion/
Posted By
973b658
Posted At
June 14, 2023, 8:27 a.m.

Question

You are building a solution that will detect anomalies in sensor data from the previous 24 hours.

You need to ensure that the solution scans the entire dataset, at the same time, for anomalies.

Which type of detection should you use?

  • A. batch
  • B. streaming
  • C. change points

Suggested Answer

A

Comments 5 comments

Comment 1

ID: 1223140 User: anto69 Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Mon 02 Dec 2024 14:29 Selected Answer: A Upvotes: 1

ChatGPT: Batch detection

Comment 2

ID: 1217603 User: nanaw770 Badges: - Relative Date: 1 year, 3 months ago Absolute Date: Sun 24 Nov 2024 17:43 Selected Answer: A Upvotes: 1

A is the right answer. From the Takedajuku perspective, if you study for 4 days and spend 2 days reviewing, you will have a better chance of passing the exam.

Comment 3

ID: 1151741 User: evangelist Badges: - Relative Date: 1 year, 6 months ago Absolute Date: Fri 16 Aug 2024 04:22 Selected Answer: A Upvotes: 1

Because it needs to process the whole 24 hours of data, it has to be batch detection.

Comment 4

ID: 923784 User: Tin_Tin Badges: - Relative Date: 2 years, 2 months ago Absolute Date: Fri 15 Dec 2023 09:33 Selected Answer: A Upvotes: 1

A is correct. see https://learn.microsoft.com/en-us/azure/cognitive-services/anomaly-detector/overview

Comment 5

ID: 922802 User: 973b658 Badges: - Relative Date: 2 years, 2 months ago Absolute Date: Thu 14 Dec 2023 09:27 Selected Answer: A Upvotes: 1

A.
>solution scans the entire dataset
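
The batch-versus-streaming distinction maps directly onto the Anomaly Detector REST routes. A sketch follows; the endpoint and key are placeholders, and the series is truncated to a single point for brevity.

```shell
# Batch detection submits the entire series in one request to the
# /timeseries/entire/detect route; streaming (latest-point) detection would
# use /timeseries/last/detect instead.
ENDPOINT="https://<your-resource>.cognitiveservices.azure.com"
BODY='{"granularity":"hourly","series":[{"timestamp":"2024-01-01T00:00:00Z","value":32.1}]}'

BATCH_CALL="curl -X POST ${ENDPOINT}/anomalydetector/v1.1/timeseries/entire/detect -H 'Ocp-Apim-Subscription-Key: <key>' -H 'Content-Type: application/json' -d '${BODY}'"

echo "${BATCH_CALL}"
```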

10. AI-102 Topic 1 Question 50

Sequence
193
Discussion ID
109884
Source URL
https://www.examtopics.com/discussions/microsoft/view/109884-exam-ai-102-topic-1-question-50-discussion/
Posted By
mVic
Posted At
May 22, 2023, 12:15 p.m.

Question

You are developing a monitoring system that will analyze engine sensor data, such as rotation speed, angle, temperature, and pressure. The system must generate an alert in response to atypical values.

What should you include in the solution?

  • A. Application Insights in Azure Monitor
  • B. metric alerts in Azure Monitor
  • C. Multivariate Anomaly Detection
  • D. Univariate Anomaly Detection

Suggested Answer

C

Comments 12 comments

Comment 1

ID: 1149673 User: evangelist Badges: Highly Voted Relative Date: 2 years ago Absolute Date: Wed 14 Feb 2024 01:07 Selected Answer: C Upvotes: 12

If Metrics Advisor is not among the options, choose Multivariate Anomaly Detection as the secondary option; Metrics Advisor would be the best answer.

Comment 2

ID: 936140 User: Pixelmate Badges: Highly Voted Relative Date: 2 years, 8 months ago Absolute Date: Wed 28 Jun 2023 07:15 Selected Answer: - Upvotes: 11

This appeared in the exam on 28/06.

Comment 3

ID: 1297853 User: Sujeeth Badges: Most Recent Relative Date: 1 year, 4 months ago Absolute Date: Tue 15 Oct 2024 00:46 Selected Answer: - Upvotes: 3

The correct choice is C. Multivariate Anomaly Detection.
In this scenario, multiple engine sensor data points (such as rotation speed, angle, temperature, and pressure) are being monitored together, and they are likely interdependent. Multivariate Anomaly Detection can analyze these related variables simultaneously to detect anomalies that arise from their combined patterns. This makes it well-suited for identifying atypical values in complex systems like engines, where relationships between different variables are critical to detecting anomalies.
A. Application Insights in Azure Monitor is mainly used for monitoring application performance and diagnostics.
B. Metric alerts in Azure Monitor track specific metrics, but they are usually based on single variables rather than multivariate relationships.
D. Univariate Anomaly Detection focuses on detecting anomalies in a single variable, which would not be as effective in this scenario where multiple interdependent variables are involved

Comment 4

ID: 1235156 User: HaraTadahisa Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Sat 22 Jun 2024 08:02 Selected Answer: C Upvotes: 2

"Multivariate Anomaly Detection" is needed.

Comment 5

ID: 1232740 User: happychuks Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Wed 19 Jun 2024 08:50 Selected Answer: - Upvotes: 1

Correct

Comment 6

ID: 1217605 User: nanaw770 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Fri 24 May 2024 16:44 Selected Answer: C Upvotes: 1

C is the right answer.

Comment 7

ID: 1213788 User: reiwanotora Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Sun 19 May 2024 15:06 Selected Answer: C Upvotes: 1

C is right.

Comment 8

ID: 1151727 User: evangelist Badges: - Relative Date: 2 years ago Absolute Date: Fri 16 Feb 2024 04:41 Selected Answer: - Upvotes: 2

Please do not choose A; this is a machine learning topic and has nothing to do with Azure Monitor.

Comment 9

ID: 1147892 User: evangelist Badges: - Relative Date: 2 years ago Absolute Date: Mon 12 Feb 2024 09:19 Selected Answer: C Upvotes: 3

C. Multivariate Anomaly Detection is the most comprehensive solution. It allows for the analysis of complex relationships between multiple metrics, which is crucial for accurately identifying anomalies in a system as intricate as an engine. This approach can help detect situations where the anomaly is not in the individual metrics but in their unexpected patterns or combinations.

Comment 10

ID: 925596 User: Hisayuki Badges: - Relative Date: 2 years, 8 months ago Absolute Date: Sat 17 Jun 2023 00:55 Selected Answer: C Upvotes: 4

Multivariate Anomaly Detection - If your goal is to detect system level anomalies from a group of time series data, use multivariate anomaly detection APIs.

Comment 11

ID: 924556 User: endeesa Badges: - Relative Date: 2 years, 8 months ago Absolute Date: Thu 15 Jun 2023 22:01 Selected Answer: A Upvotes: 3

Anomaly detection does not support alerting though

Comment 12

ID: 903913 User: mVic Badges: - Relative Date: 2 years, 9 months ago Absolute Date: Mon 22 May 2023 12:15 Selected Answer: C Upvotes: 4

The Multivariate Anomaly Detection APIs further enable developers by easily integrating advanced AI for detecting anomalies from groups of metrics, without the need for machine learning knowledge or labeled data.


https://learn.microsoft.com/en-us/azure/cognitive-services/anomaly-detector/overview#multivariate-anomaly-detection
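
To make the univariate/multivariate split concrete: univariate routes accept one series per request, while the multivariate API runs a trained model over a group of correlated series. The route below is a sketch of the v1.1 shape, and <modelId> stands in for a model you would have trained beforehand.

```shell
# Sketch: multivariate detection references a trained model covering all
# sensor variables (rotation speed, angle, temperature, pressure) at once.
MV_CALL="curl -X POST https://<your-resource>.cognitiveservices.azure.com/anomalydetector/v1.1/multivariate/models/<modelId>:detect-batch -H 'Ocp-Apim-Subscription-Key: <key>'"

echo "${MV_CALL}"
```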

11. AI-102 Topic 1 Question 13

Sequence
226
Discussion ID
60125
Source URL
https://www.examtopics.com/discussions/microsoft/view/60125-exam-ai-102-topic-1-question-13-discussion/
Posted By
SuperPetey
Posted At
Aug. 21, 2021, 9:46 a.m.

Question

You are developing a new sales system that will process the video and text from a public-facing website.
You plan to notify users that their data has been processed by the sales system.
Which responsible AI principle does this help meet?

  • A. transparency
  • B. fairness
  • C. inclusiveness
  • D. reliability and safety

Suggested Answer

A

Comments 18 comments

Comment 1

ID: 428558 User: SuperPetey Badges: Highly Voted Relative Date: 4 years, 6 months ago Absolute Date: Sat 21 Aug 2021 09:46 Selected Answer: - Upvotes: 78

The correct answer is A, transparency: "When an AI application relies on personal data, such as a facial recognition system that takes images of people to recognize them; you should make it clear to the user how their data is used and retained, and who has access to it." from: https://docs.microsoft.com/en-us/learn/paths/prepare-for-ai-engineering/

Comment 2

ID: 432798 User: ayoitu Badges: Highly Voted Relative Date: 4 years, 6 months ago Absolute Date: Fri 27 Aug 2021 07:23 Selected Answer: - Upvotes: 16

"Transparency: AI systems should be understandable."
"Reliability and safety: AI systems should perform reliably and safely."
So the answer is correct: reliability and safety.
https://docs.microsoft.com/en-us/azure/cloud-adoption-framework/strategy/responsible-ai

Comment 2.1

ID: 482261 User: TanujitRoy Badges: - Relative Date: 4 years, 3 months ago Absolute Date: Sat 20 Nov 2021 05:30 Selected Answer: - Upvotes: 2

Brother, my internet isn't working here; come and fix it properly... and bring that Azure 'gamucha' tool along.

Comment 3

ID: 1282297 User: AzureGeek79 Badges: Most Recent Relative Date: 1 year, 6 months ago Absolute Date: Wed 11 Sep 2024 21:15 Selected Answer: - Upvotes: 1

A seems correct. I just asked ChatGPT and it says transparency.

Comment 4

ID: 1260895 User: moonlightc Badges: - Relative Date: 1 year, 7 months ago Absolute Date: Mon 05 Aug 2024 04:12 Selected Answer: - Upvotes: 1

A is the correct answer

Comment 5

ID: 1226699 User: milowac Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Sat 08 Jun 2024 13:10 Selected Answer: A Upvotes: 2

A is the right answer.

Comment 6

ID: 1219579 User: takaimomoGcup Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Mon 27 May 2024 15:21 Selected Answer: A Upvotes: 1

Transparency is the right answer.

Comment 7

ID: 1217872 User: PeteColag Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Sat 25 May 2024 01:31 Selected Answer: A Upvotes: 1

Transparency, because you are informing users that their data is being processed.

Comment 8

ID: 1152236 User: audlindr Badges: - Relative Date: 2 years ago Absolute Date: Fri 16 Feb 2024 22:41 Selected Answer: A Upvotes: 2

Transparency is the right answer.
Transparency: AI systems should be understandable.

Comment 9

ID: 1147033 User: evangelist Badges: - Relative Date: 2 years, 1 month ago Absolute Date: Sun 11 Feb 2024 08:23 Selected Answer: A Upvotes: 3

Notifying users that their data has been processed by the sales system helps meet the A. transparency principle. Transparency involves informing users about how their data is used, processed, and the purpose behind it, thereby fostering trust and understanding between the technology providers and the users.

Comment 10

ID: 1147024 User: evangelist Badges: - Relative Date: 2 years, 1 month ago Absolute Date: Sun 11 Feb 2024 08:00 Selected Answer: A Upvotes: 3

Notifying users that their data has been processed by the sales system helps meet the A. transparency principle. Transparency involves informing users about how their data is used, processed, and the purpose behind it, thereby fostering trust and understanding between the technology providers and the users.

Comment 11

ID: 1090012 User: orionduo Badges: - Relative Date: 2 years, 3 months ago Absolute Date: Thu 07 Dec 2023 08:06 Selected Answer: A Upvotes: 2

Transparency.
AI systems should be understandable. Users should be made fully aware of the purpose of the system, how it works, and what limitations may be expected.
REF: https://learn.microsoft.com/en-us/training/modules/prepare-to-develop-ai-solutions-azure/5-understand-considerations-for-responsible-ai

Comment 12

ID: 1002317 User: NickYog Badges: - Relative Date: 2 years, 6 months ago Absolute Date: Fri 08 Sep 2023 11:33 Selected Answer: B Upvotes: 1

The system should treat everyone fairly.

Comment 13

ID: 996133 User: Pinkshark Badges: - Relative Date: 2 years, 6 months ago Absolute Date: Fri 01 Sep 2023 16:28 Selected Answer: A Upvotes: 1

correct for me A

Comment 14

ID: 995784 User: kiro_kocha Badges: - Relative Date: 2 years, 6 months ago Absolute Date: Fri 01 Sep 2023 09:11 Selected Answer: A Upvotes: 1

Transparency, of course. Users should be notified about what information will be processed by the AI solution.

Comment 15

ID: 933725 User: savetheplanet Badges: - Relative Date: 2 years, 8 months ago Absolute Date: Sun 25 Jun 2023 17:37 Selected Answer: - Upvotes: 2

A
It's clearly transparency; why does this website select the wrong answer?

Comment 16

ID: 840844 User: marti_tremblay000 Badges: - Relative Date: 2 years, 12 months ago Absolute Date: Thu 16 Mar 2023 11:59 Selected Answer: A Upvotes: 1

ChatGPT answer :
The responsible AI principle that notifying users that their data has been processed by the sales system helps meet is A. transparency.

Transparency is the principle of making AI systems understandable and providing clear explanations of their decisions and actions. By notifying users that their data has been processed by the sales system, you are being transparent about the fact that their data is being used and how it is being used. This helps users understand how their data is being used and can help build trust between users and the sales system.

Comment 17

ID: 779114 User: ap1234pa Badges: - Relative Date: 3 years, 1 month ago Absolute Date: Tue 17 Jan 2023 18:13 Selected Answer: A Upvotes: 1

A is correct

12. AI-102 Topic 3 Question 63

Sequence
239
Discussion ID
135943
Source URL
https://www.examtopics.com/discussions/microsoft/view/135943-exam-ai-102-topic-3-question-63-discussion/
Posted By
-
Posted At
March 13, 2024, 5:26 p.m.

Question

You have an Azure subscription that contains an Azure App Service app named App1.

You provision a multi-service Azure Cognitive Services resource named CSAccount1.

You need to configure App1 to access CSAccount1. The solution must minimize administrative effort.

What should you use to configure App1?

  • A. a system-assigned managed identity and an X.509 certificate
  • B. the endpoint URI and an OAuth token
  • C. the endpoint URI and a shared access signature (SAS) token
  • D. the endpoint URI and subscription key

Suggested Answer

D

Comments (7)

Comment 1

ID: 1275150 User: JakeCallham Badges: - Relative Date: 1 year, 6 months ago Absolute Date: Fri 30 Aug 2024 17:54 Selected Answer: D Upvotes: 3

D is right, but I sure hope that nobody uses keys anymore. It's considered bad practice; use managed identities and RBAC to do this.

Comment 2

ID: 1265828 User: anto69 Badges: - Relative Date: 1 year, 6 months ago Absolute Date: Wed 14 Aug 2024 17:00 Selected Answer: D Upvotes: 1

It's 100% D. Confirmed by ChatGPT too

Comment 3

ID: 1235184 User: HaraTadahisa Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Sat 22 Jun 2024 08:20 Selected Answer: D Upvotes: 1

I say this answer is D.

Comment 4

ID: 1229212 User: reigenchimpo Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Wed 12 Jun 2024 16:11 Selected Answer: D Upvotes: 1

D is the answer.

Comment 5

ID: 1220353 User: nanaw770 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Tue 28 May 2024 16:34 Selected Answer: D Upvotes: 2

D is the right answer.

Comment 6

ID: 1196334 User: michaelmorar Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Tue 16 Apr 2024 06:11 Selected Answer: - Upvotes: 2

In general, you always need an endpoint and subscription key.

Comment 7

ID: 1177771 User: Murtuza Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Tue 19 Mar 2024 22:54 Selected Answer: - Upvotes: 2

By providing the endpoint URI and subscription key in your application, you can seamlessly connect to CSAccount1 without additional complexities or setup. This approach minimizes administrative overhead and ensures secure communication between your app and the cognitive services.

Therefore, the correct answer is D. the endpoint URI and subscription key.
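As the comments above note, a multi-service resource is reachable with nothing more than the endpoint URI and a subscription key. A minimal sketch of the request App1 would build (the endpoint, key, and the Language detection path below are placeholder examples, not values taken from the question):

```python
def build_cs_request(endpoint: str, key: str, path: str):
    """Build the URL and auth headers for a REST call to a multi-service
    Azure AI Services (Cognitive Services) resource. The subscription key
    is sent in the Ocp-Apim-Subscription-Key header."""
    url = endpoint.rstrip("/") + path
    headers = {
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    }
    return url, headers

# Hypothetical endpoint and key -- real values come from the resource's
# "Keys and Endpoint" blade in the Azure portal.
url, headers = build_cs_request(
    "https://csaccount1.cognitiveservices.azure.com",
    "<subscription-key>",
    "/text/analytics/v3.1/languages",  # example: Language detection operation
)
# An actual call would then be, e.g.:
# requests.post(url, headers=headers, json={"documents": [...]})
```

This is exactly why option D minimizes administrative effort: no certificates, tokens, or identity configuration are needed, only two strings.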

13. AI-102 Topic 1 Question 68

Sequence
248
Discussion ID
135194
Source URL
https://www.examtopics.com/discussions/microsoft/view/135194-exam-ai-102-topic-1-question-68-discussion/
Posted By
GHill1982
Posted At
March 4, 2024, 9:22 p.m.

Question

You have an Azure subscription that contains an Azure AI Service resource named CSAccount1 and a virtual network named VNet1. CSAccount1 is connected to VNet1.

You need to ensure that only specific resources can access CSAccount1. The solution must meet the following requirements:

• Prevent external access to CSAccount1.
• Minimize administrative effort.

Which two actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct answer is worth one point.

  • A. In VNet1, enable a service endpoint for CSAccount1.
  • B. In CSAccount1, configure the Access control (IAM) settings.
  • C. In VNet1, modify the virtual network settings.
  • D. In VNet1, create a virtual subnet.
  • E. In CSAccount1, modify the virtual network settings.

Suggested Answer

AE

Comments (19)

Comment 1

ID: 1171395 User: chandiochan Badges: Highly Voted Relative Date: 2 years ago Absolute Date: Tue 12 Mar 2024 03:18 Selected Answer: AE Upvotes: 12

A. In VNet1, enable a service endpoint for CSAccount1. This allows you to secure your Azure service resources to the virtual network.

E. In CSAccount1, modify the virtual network settings. This will allow you to configure CSAccount1 to accept connections only from the virtual network VNet1.

Enabling service endpoints and modifying the virtual network settings for the AI Service resource will limit access to the resources within VNet1, effectively fulfilling both requirements.

Comment 2

ID: 1165977 User: GHill1982 Badges: Highly Voted Relative Date: 2 years ago Absolute Date: Mon 04 Mar 2024 21:22 Selected Answer: AE Upvotes: 5

In VNet1, enable a service endpoint for CSAccount1. This will allow you to connect your virtual network to your Azure AI Service resource securely over the Azure backbone network.

In CSAccount1, modify the virtual network settings. This will allow you to configure virtual network rules that specify which subnets can access your Azure AI Service resource.

Comment 3

ID: 1265575 User: anto69 Badges: Most Recent Relative Date: 1 year, 7 months ago Absolute Date: Wed 14 Aug 2024 10:33 Selected Answer: AE Upvotes: 1

ChatGPT confirms A and E

Comment 4

ID: 1235669 User: rookiee1111 Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Sun 23 Jun 2024 03:51 Selected Answer: AE Upvotes: 2

A - Creating a service endpoint for CSAccount1 on VNet1 ensures that it's the only way of accessing the service, and it ensures a secure connection.
E - Adding network settings on the CSAccount1 service ensures that access is restricted to resources within VNet1.

Comment 5

ID: 1232361 User: Nat69 Badges: - Relative Date: 1 year, 8 months ago Absolute Date: Tue 18 Jun 2024 12:10 Selected Answer: - Upvotes: 1

To ensure that only specific resources can access CSAccount1 and prevent external access while minimizing administrative effort, you should perform the following actions:

A. In VNet1, enable a service endpoint for CSAccount1.
E. In CSAccount1, modify the virtual network settings.

Explanation:
Enable a Service Endpoint for CSAccount1 (Action A):

Service endpoints provide direct connectivity to Azure services over an optimized route over the Azure backbone network. By enabling a service endpoint for CSAccount1 on VNet1, you ensure that the traffic between VNet1 and CSAccount1 does not go over the internet, enhancing security and meeting the requirement to prevent external access.
Modify the Virtual Network Settings in CSAccount1 (Action E):

Configuring the virtual network settings in CSAccount1 allows you to specify which subnets within VNet1 can access the AI service. This way, you can control access at a more granular level and ensure that only specific resources within those subnets can access CSAccount1.

Comment 6

ID: 1225358 User: p2006 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Thu 06 Jun 2024 12:19 Selected Answer: AD Upvotes: 1

https://learn.microsoft.com/en-us/azure/ai-services/cognitive-services-virtual-networks?tabs=portal#configure-virtual-network-rules

Comment 7

ID: 1224760 User: Belicova Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Wed 05 Jun 2024 15:57 Selected Answer: AD Upvotes: 1

From Copilot:
To achieve the desired requirements of preventing external access to CSAccount1 while minimizing administrative effort, consider the following actions:

Configure network rules for CSAccount1:
  • Go to the Azure AI services resource you want to secure.
  • Under Firewalls and virtual networks, select Selected Networks and Private Endpoints.
  • Deny access by default to all networks, including internet traffic.
  • Then, configure rules to grant access only to traffic from specific virtual networks.
Create a virtual subnet in VNet1:
  • This allows you to isolate resources within a specific subnet, ensuring that only authorized traffic can reach CSAccount1.
Therefore, the correct answers are A. In VNet1, enable a service endpoint for CSAccount1 and D. In VNet1, create a virtual subnet. These actions align with the requirements and minimize administrative overhead.

Comment 8

ID: 1224482 User: anto69 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Wed 05 Jun 2024 04:58 Selected Answer: AE Upvotes: 1

ChatGPT: A and E

Comment 9

ID: 1221352 User: PeteColag Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Thu 30 May 2024 02:33 Selected Answer: - Upvotes: 1

Services like Azure AI Search, Video Indexer, and Immersive Reader do not support VNet settings configuration. For such services, E is not a viable response.

Comment 10

ID: 1217583 User: nanaw770 Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Fri 24 May 2024 16:29 Selected Answer: AE Upvotes: 1

A and E.

Comment 11

ID: 1213773 User: reiwanotora Badges: - Relative Date: 1 year, 9 months ago Absolute Date: Sun 19 May 2024 14:34 Selected Answer: AE Upvotes: 1

I will also vote for AE.

Comment 12

ID: 1203242 User: AzureGC Badges: - Relative Date: 1 year, 10 months ago Absolute Date: Sat 27 Apr 2024 20:35 Selected Answer: AE Upvotes: 2

AE ...

I do NOT think B is correct: the IAM controls do not necessarily help if the endpoint and key are compromised; you have to use VNet controls to gate the service endpoints.

Comment 13

ID: 1202668 User: Jimmy1017 Badges: - Relative Date: 1 year, 10 months ago Absolute Date: Fri 26 Apr 2024 17:07 Selected Answer: - Upvotes: 3

A. In VNet1, enable a service endpoint for CSAccount1.
B. In CSAccount1, configure the Access control (IAM) settings.

Explanation:

A. Enabling a service endpoint for CSAccount1 in VNet1 allows traffic from the virtual network to reach CSAccount1 without traversing the public internet, thus preventing external access.

B. Configuring the Access control (IAM) settings in CSAccount1 allows you to specify which specific resources or identities have access to CSAccount1. By configuring these settings, you can ensure that only specific resources can access CSAccount1, meeting the requirement to restrict access.

Comment 14

ID: 1192753 User: franceshuang Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Wed 10 Apr 2024 08:35 Selected Answer: - Upvotes: 1

AE should be right

Comment 15

ID: 1192321 User: TT924 Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Tue 09 Apr 2024 16:09 Selected Answer: AB Upvotes: 2

A. Enable a service endpoint for Azure AI services within the virtual network. The service endpoint routes traffic from the virtual network through an optimal path to the Azure AI service.
https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/role-based-access-control

B. There are multiple default role-based access roles: Cognitive Services OpenAI User, Cognitive Services OpenAI Contributor, Cognitive Services Contributor, Cognitive Services Usages Reader.

You can also set up Azure RBAC for whole resource groups, subscriptions, or management groups. Do this by selecting the desired scope level and then navigating to the desired item. For example, selecting Resource groups and then navigating to a specific resource group.

Select Access control (IAM) on the left navigation pane.

https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/role-based-access-control

Comment 15.1

ID: 1192336 User: TT924 Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Tue 09 Apr 2024 16:39 Selected Answer: - Upvotes: 1

For A, the reference should be https://learn.microsoft.com/en-us/azure/search/service-create-private-endpoint

Comment 16

ID: 1187882 User: NullVoider_0 Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Tue 02 Apr 2024 08:25 Selected Answer: AE Upvotes: 3

A. In VNet1, enable a service endpoint for CSAccount1. Enabling a service endpoint for CSAccount1 in VNet1 will allow you to secure the Azure AI Service resource to a specific subset of networks. This means that only the applications requesting data over VNet1 will be able to access CSAccount1. It’s a way to ensure that the resource is only accessible from within the virtual network.

E. In CSAccount1, modify the virtual network settings. By modifying the virtual network settings in CSAccount1, you can configure network rules that limit access to the resource. You would set the default network access rule to deny access to all networks, including the internet. Then, you can specify which virtual networks or subnets are allowed to access CSAccount1.

Comment 16.1

ID: 1190214 User: Training Badges: - Relative Date: 1 year, 11 months ago Absolute Date: Sat 06 Apr 2024 05:53 Selected Answer: - Upvotes: 1

Where does it mention that Azure AI services have support for Azure VNet service endpoints?

Comment 17

ID: 1175013 User: Murtuza Badges: - Relative Date: 1 year, 12 months ago Absolute Date: Sat 16 Mar 2024 15:27 Selected Answer: AB Upvotes: 4

the correct answers are A (enable a service endpoint for CSAccount1) and B (configure the Access control (IAM) settings). These actions provide a secure and efficient solution for restricting access to CSAccount1 while minimizing administrative overhead.

14. AI-102 Topic 1 Question 44

Sequence
318
Discussion ID
106675
Source URL
https://www.examtopics.com/discussions/microsoft/view/106675-exam-ai-102-topic-1-question-44-discussion/
Posted By
mgafar
Posted At
April 19, 2023, 7:36 a.m.

Question

SIMULATION
Use the following login credentials as needed:
To enter your username, place your cursor in the Sign in box and click on the username below.
To enter your password, place your cursor in the Enter password box and click on the password below.

Azure Username: [email protected]

Azure Password: XXXXXXXXXXXX

The following information is for technical support purposes only:

Lab Instance: 12345678

Task -
You need to ensure that a user named [email protected] can regenerate the subscription keys of AAA12345678. The solution must use the principle of least privilege.
To complete this task, sign in to the Azure portal.

Suggested Answer

image

Comments (7)

Comment 1

ID: 917336 User: ziggy1117 Badges: Highly Voted Relative Date: 2 years, 9 months ago Absolute Date: Wed 07 Jun 2023 17:17 Selected Answer: - Upvotes: 13

Cognitive Services Contributor
Lets you create, read, update, delete and manage keys of Cognitive Services.

Comment 2

ID: 1217612 User: nanaw770 Badges: Most Recent Relative Date: 1 year, 9 months ago Absolute Date: Fri 24 May 2024 16:48 Selected Answer: - Upvotes: 2

Simulation questions will not appear on the actual exam as of May 25, 2024; ET should remove this type of question.

Comment 3

ID: 993900 User: M25 Badges: - Relative Date: 2 years, 6 months ago Absolute Date: Wed 30 Aug 2023 11:31 Selected Answer: - Upvotes: 2

https://learn.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal

https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#cognitive-services-user
Classic Storage Account Key Operator Service Role:
Classic Storage Account Key Operators are allowed to list and regenerate keys on Classic Storage Accounts

Comment 4

ID: 874301 User: mgafar Badges: - Relative Date: 2 years, 10 months ago Absolute Date: Wed 19 Apr 2023 07:36 Selected Answer: - Upvotes: 3

1. Sign in to the Azure portal (https://portal.azure.com/) using your account credentials.
2. In the left-hand navigation menu, click on "All services" and search for "Subscriptions." Click on the "Subscriptions" service to open the list of your Azure subscriptions.
3. Find the subscription with the ID "AAA12345678" and click on it to open the subscription details page.
4. In the left-hand navigation menu of the subscription details page, click on "Access control (IAM)."
5. Click on the "+ Add" button to add a new role assignment. This will open the "Add role assignment" pane.
6. In the "Role" dropdown menu, search for and select the "User Access Administrator" role. This role allows a user to manage access to Azure resources, including the ability to manage subscription keys, while adhering to the principle of least privilege.
7. In the "Select" field, type "[email protected]" and select the user from the list of suggestions.
8. Click on the "Save" button to complete the role assignment process.

Comment 4.1

ID: 1132528 User: AnonymousJhb Badges: - Relative Date: 2 years, 1 month ago Absolute Date: Fri 26 Jan 2024 14:11 Selected Answer: - Upvotes: 1

NEVER! This question asks for the principle of least privilege.
You do not apply the RBAC IAM role at the subscription level.
You need to drill down past the resource group to the resource itself, go to IAM, and add the user account as a Cognitive Services Contributor.

Comment 4.2

ID: 917335 User: ziggy1117 Badges: - Relative Date: 2 years, 9 months ago Absolute Date: Wed 07 Jun 2023 17:16 Selected Answer: - Upvotes: 1

User Access Administrator does not allow you to regenerate keys, but it allows you to add users.

Comment 4.2.1

ID: 965518 User: Jo_Hannes Badges: - Relative Date: 2 years, 7 months ago Absolute Date: Fri 28 Jul 2023 13:15 Selected Answer: - Upvotes: 1

Cognitive Services Contributor
Lets you create, read, update, delete and manage keys of Cognitive Services.
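Following ziggy1117's and AnonymousJhb's reasoning, least privilege means assigning Cognitive Services Contributor at the scope of the resource itself. A sketch of the ARM REST request such an assignment produces, with placeholder IDs throughout (the role-definition ID and the user's object ID are assumptions to be filled in, e.g. via `az role definition list --name "Cognitive Services Contributor"`):

```python
import uuid

def build_role_assignment(scope: str, role_definition_id: str, principal_id: str):
    """Build the ARM REST request that assigns a role at a given scope.
    Passing the Cognitive Services *resource* as the scope (rather than the
    subscription or resource group) is what satisfies least privilege."""
    url = (
        "https://management.azure.com"
        f"{scope}/providers/Microsoft.Authorization/roleAssignments/"
        f"{uuid.uuid4()}?api-version=2022-04-01"
    )
    body = {
        "properties": {
            "roleDefinitionId": role_definition_id,
            "principalId": principal_id,
        }
    }
    return url, body

# Hypothetical scope pointing at the account from the task, not the subscription.
resource_scope = (
    "/subscriptions/00000000-0000-0000-0000-000000000000"
    "/resourceGroups/rg1/providers/Microsoft.CognitiveServices/accounts/AAA12345678"
)
url, body = build_role_assignment(
    resource_scope,
    "<full roleDefinitions resource ID>",
    "<object ID of the target user>",
)
```

The same narrowing applies in the portal: open the Cognitive Services account itself, then Access control (IAM), then add the role assignment there.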