AZ 900 Practice Exam 1
Take your exam preparation to the next level with fully simulated online practice tests designed to replicate the real exam experience. These exams feature realistic questions, timed conditions, and detailed explanations to help you assess your knowledge, identify weak areas, and build confidence before test day.
1. A healthcare organization is implementing MFA in Azure for compliance with regulatory standards that require “something you know and something you have” for access. They want to reduce dependency on passwords but still meet the regulatory requirements. Which MFA configuration would best meet their needs?
Answer: B. Traditional password plus Microsoft Authenticator notification (B) is the best option to satisfy the requirement of “something you know and something you have,” as it involves a password and a separate factor through the Authenticator app, providing both security and regulatory compliance. Passwordless sign-in (A) does not involve “something you know” and thus may not fully meet the regulatory requirement. SMS-based authentication only (C) relies on a single factor (“something you have”) and omits the required “something you know.” Conditional Access with IP-based restrictions (D) improves access control but does not provide a second authentication factor.
2. A financial institution is implementing Zero Trust in its Azure environment to ensure that only verified users and devices can access high-sensitivity applications. They require Multi-Factor Authentication (MFA) for access, particularly for users outside the corporate network. Which configuration best supports this Zero Trust approach?
Answer: A. Configure Conditional Access policies with MFA for high-risk logins and external locations (A) is the best choice as it enforces strict authentication controls, verifying both user identity and login conditions, in line with Zero Trust’s “never trust, always verify” principle. SSO (B) simplifies access but does not inherently ensure the extra verification needed for Zero Trust. Assigning the Owner role (C) grants excessive permissions and contradicts the principle of least privilege in Zero Trust. Relying solely on password policies (D) does not provide the verification level required, as it lacks a second factor for authentication.
3. An organization’s application logs need to be monitored to quickly identify and troubleshoot any application errors impacting users. The team decides to use Log Analytics but is unsure how to set up proactive monitoring. Which approach would best help them catch and investigate errors efficiently?
Answer: A. Explanation: The team should configure a query in Log Analytics Workspace (A) to run periodically and trigger alerts based on specific error codes or exception logs, enabling proactive error detection and troubleshooting. Log Analytics provides custom queries and alerting capabilities tailored to application log monitoring. Option (B) focuses on cost rather than error tracking. Option (C), Application Insights, is oriented toward application performance telemetry rather than the custom log-query alerting described here. Option (D), Service Health, tracks Azure service status but doesn’t monitor application-specific logs.
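For question 3, a minimal sketch of the kind of query such an alert could run, assuming application errors are ingested into a hypothetical custom table named AppLogs_CL with ErrorCode_s and Exception_s columns (real table and column names depend on how the logs reach the workspace):

    AppLogs_CL
    | where TimeGenerated > ago(15m)
    | where ErrorCode_s startswith "5" or isnotempty(Exception_s)
    | summarize ErrorCount = count() by ErrorCode_s

Saved as a log search alert rule on the workspace, a query like this can fire a notification whenever ErrorCount crosses a chosen threshold during the evaluation window.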
4. A multinational company needs to migrate its internal data analytics systems to the cloud. They require a setup that allows them to scale resources up or down based on analytics workload demands while maintaining control over the OS and network configurations. Cost-effectiveness and rapid scaling for resource-intensive tasks are key considerations. Which Azure service model should the company use?
Answer: B. Infrastructure as a Service (IaaS) Explanation: IaaS (B) is ideal for scenarios where the organization needs scalability for compute-intensive workloads along with the ability to manage OS, storage, and network configurations. In IaaS, the company can increase resources when the analytics workload is high and scale down during lower demand, thereby optimizing costs effectively. PaaS (A) supports rapid deployment but does not provide as much control over infrastructure components, which limits flexibility in configuring analytics environments. SaaS (C) focuses on fully managed applications, unsuitable for highly configurable data analytics setups. FaaS (D) is better suited for event-driven workloads, not continuous analytics requiring control over underlying resources.
5. A streaming service wants to integrate Azure AD B2C for customer logins and provide a tailored experience based on user location and subscription level. How can they use Azure AD B2C to implement this conditional access to the platform?
Answer: A. Custom Policies with Conditional Access (A) allow the streaming service to define rules for the user experience based on factors like location and subscription level, providing a tailored login and access experience. Role-Based Access Control (RBAC) (B) is used to control access within Azure but does not provide the customizable customer experience Azure AD B2C offers. Azure AD Connect with Pass-through Authentication (C) supports identity synchronization but is not relevant to customizing customer login. Multi-Factor Authentication (MFA) (D) enhances security but does not create tailored access experiences.
6. Your organization requires periodic documentation of your Azure environment's resource hierarchy and configuration settings for compliance purposes. You need to export and report on these resources directly from the Azure portal in a manageable format. Which feature should you utilize to extract a structured list of resources with their configurations and relationships?
Correct Answer: B Explanation: The correct answer is B. Resource Graph Explorer allows for querying Azure resources with flexibility and exporting data in a structured format, making it ideal for compliance and reporting on configurations and relationships. Azure Advisor (A) provides best-practice recommendations but doesn’t support exporting comprehensive resource lists. ARM Templates in Deployment History (C) are primarily for tracking deployment configurations but are limited to templates rather than a complete inventory. Cost Management + Billing (D) focuses on financial data and lacks resource hierarchy or configuration insights.
7. A tech startup has a hybrid infrastructure and plans to move several applications reliant on legacy authentication protocols to Azure. However, they want to minimize the administrative effort of setting up and managing their own Active Directory in Azure. How can Microsoft Entra Domain Services help them maintain these legacy authentication requirements without managing a full Active Directory deployment?
Answer: A. By providing a managed domain with Kerberos and NTLM support (A), Microsoft Entra Domain Services allows the startup to migrate legacy applications to Azure while maintaining compatibility with required authentication protocols without the overhead of managing an Active Directory infrastructure. Synchronization with Microsoft Entra ID Connect (B) replicates identities but does not provide Kerberos or NTLM support in a managed environment. Enforcing multi-factor authentication (C) adds security but does not address legacy authentication requirements. Role-Based Access Control (RBAC) (D) assigns permissions but does not support legacy authentication methods like Kerberos or NTLM.
8. A global logistics company relies on Azure for its mission-critical tracking application. They need to ensure predictable latency for users worldwide while maintaining a reliable service, even during high traffic periods. What would be the best way to meet these needs on Azure?
Correct Answer: A Explanation: Option (A) is correct because deploying the application in multiple regions and using Traffic Manager with performance routing optimizes latency and reliability by directing users to the nearest region, improving service predictability. This setup ensures consistent performance and fault tolerance by avoiding reliance on a single location. Option (B) is incorrect as a single VM in one region cannot ensure predictable performance for a global user base and may be susceptible to regional failures. Option (C) is incorrect as Azure Functions are best suited for stateless tasks and do not address latency needs. Option (D) is incorrect because Azure CDN only manages static content, while the application requires global, real-time interaction that CDN alone cannot address.
9. A software company has multiple projects under different departments, each requiring its own development, testing, and production environments on Azure. They want to ensure that each department’s resources are easily identifiable and managed independently while applying consistent policies at the organizational level. What is the most effective hierarchy to implement for this requirement?
Correct Answer: B Explanation: Setting up a management group for each department and organizing development, testing, and production subscriptions within each management group (B) enables the company to apply consistent organizational policies at the root management level while allowing each department to manage its resources independently. This hierarchy provides clear segmentation and control across project environments. Option A (separate resource groups in a single subscription) would not provide isolation at the subscription level, limiting resource management. Option C (tags within a single subscription) lacks the isolation required for independent management. Option D (using a single resource group for all subscriptions) undermines policy enforcement and resource organization, making B the most practical solution.
10. A company needs to ensure that files uploaded to Azure using AzCopy are stored with the correct metadata. Specifically, they want all files to have a designated content type. Which AzCopy command option would allow them to set this metadata during the transfer process?
Answer: C. /set-content-type Explanation: The /set-content-type option (C) in AzCopy allows users to specify the content type for files during transfer, ensuring that the correct metadata is applied directly within Azure Blob Storage. This feature is particularly useful for managing file properties, such as specifying file formats like “image/jpeg” or “application/pdf.” The /resume (A) option enables continuation of an interrupted transfer, but it does not handle metadata. The /sync (B) command is for synchronizing files and does not include metadata options. The /cap-mbps (D) command limits the transfer speed but does not impact content type or metadata, making /set-content-type the correct choice.
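For reference, current AzCopy (v10) expresses the same idea with the --content-type flag on the copy command; a rough sketch, where the account name, container, and SAS token are placeholders:

    azcopy copy "./reports/*.pdf" \
      "https://<account>.blob.core.windows.net/docs?<SAS>" \
      --content-type "application/pdf"

The exact option syntax differs between AzCopy versions, so the flag name should be checked against the version in use.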
11. A media company runs a streaming service and requires a scalable infrastructure to handle high user demand during live broadcasts. They need to configure their Virtual Machine Scale Sets so that if a VM fails or becomes unresponsive, another VM is automatically deployed to maintain service reliability. Which configuration should they use to ensure this level of fault tolerance?
Answer: B. Deploy across multiple fault domains Explanation: Deploying across multiple fault domains (B) within a Virtual Machine Scale Set ensures fault tolerance by distributing VM instances across physically separate hardware within the Azure datacenter, reducing the risk of downtime if one set of hardware fails. This setup enhances service reliability for applications like streaming, where uninterrupted availability is crucial. Using a single fault domain (A) would concentrate the VMs in one hardware cluster, increasing the risk of all VMs going down simultaneously. Horizontal scaling (C) provides increased capacity but does not enhance fault tolerance without multiple fault domains. Spot Instances (D) are cost-effective but lack guaranteed availability, making them unsuitable for high-demand live broadcasting.
12. An organization is moving its applications to Azure and needs to migrate several databases from an on-premises SQL Server to Azure SQL Database. They require a solution that ensures minimal downtime during the migration. Which tool should they use to accomplish this?
Answer: B. Azure Database Migration Service Explanation: Azure Database Migration Service (B) is designed for database migrations, offering options for minimal-downtime migrations to Azure SQL Database. It allows the organization to migrate data while minimizing impact on operations, making it the optimal choice for scenarios requiring near-continuous availability. Azure Data Box (A) is a physical transfer device, not suitable for databases requiring real-time migration. AzCopy (C) is a command-line tool for file transfers, not for structured database migration. Azure Storage Explorer (D) manages Azure Storage resources but does not support database migration.
13. An educational institution wants to provide students access to its online portal but requires an additional layer of security if students access the portal from unfamiliar or high-risk locations. They want to use Microsoft Entra ID Conditional Access for this purpose. Which policy configuration would best meet this requirement?
Answer: A. Conditional Access with Risk-Based Multi-Factor Authentication (MFA) (A) is the most appropriate solution, as it applies an additional security layer by enforcing MFA when students access the portal from unfamiliar or high-risk locations, increasing protection based on access risk. Location-based access restrictions in RBAC (B) are not supported, as RBAC focuses on role permissions rather than location-based security. Passwordless Authentication for high-risk locations (C) does not add an extra security layer based on access risk. Device-only Conditional Access policies (D) ensure device compliance but do not address high-risk locations or MFA.
14. An organization requires that each new resource deployed in Azure adhere to specific configurations, including predefined tags and location settings, to ensure governance across the environment. They want to enforce these settings through infrastructure as code rather than manually applying configurations. Which combination of Azure tools will best help them accomplish this?
Correct Answer: A Explanation: The correct answer is A. ARM Templates can define and deploy resources with specific configurations, including tags and location settings, as code, while Azure Policy can enforce compliance by ensuring these configurations are applied automatically and consistently. This combination allows the organization to use infrastructure as code while maintaining governance. Azure Automation with Custom Scripts (B) provides automation but lacks policy enforcement and governance. Azure Resource Groups with Manual Tagging (C) requires manual intervention, reducing consistency. Azure Security Center (D) focuses on security posture but does not enforce deployment configurations for governance purposes.
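As an illustration of the infrastructure-as-code half of this answer, a minimal sketch of an ARM template resource that bakes the required tags and location into the deployment (the resource name, API version, and tag values below are placeholders); an Azure Policy assignment such as the built-in “Allowed locations” or “Require a tag and its value on resources” definitions would then audit or deny anything deployed outside these settings:

    {
      "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
      "contentVersion": "1.0.0.0",
      "resources": [
        {
          "type": "Microsoft.Storage/storageAccounts",
          "apiVersion": "2022-09-01",
          "name": "stgovdemo001",
          "location": "westeurope",
          "sku": { "name": "Standard_LRS" },
          "kind": "StorageV2",
          "tags": { "costCenter": "analytics", "environment": "prod" }
        }
      ]
    }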
15. An organization has created separate virtual networks for different environments, such as development, testing, and production, within the same Azure region. They need these VNets to communicate securely for shared resource access while maintaining isolated environments. Which solution best meets this requirement?
Answer: B. Use Virtual Network Peering between VNets Explanation: Virtual Network Peering (B) enables private communication between VNets within the same or different Azure regions, allowing development, testing, and production VNets to communicate securely while maintaining isolation. VNet peering creates a direct link between VNets without requiring exposure to the public internet, fulfilling the organization’s need for isolated yet connected environments. Assigning Public IPs (A) would make resources accessible over the internet, reducing security. Network Security Groups (NSGs) (C) manage traffic flow but do not create direct connectivity between VNets. Azure VPN Gateway (D) facilitates connections between on-premises networks and Azure, not between VNets within Azure.
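A rough sketch of how one direction of such a peering could be created with the Azure CLI (resource and VNet names are placeholders; a matching peering must also be created from the remote VNet back, and --remote-vnet can be a full resource ID when the VNets sit in different resource groups or subscriptions):

    az network vnet peering create \
      --name dev-to-prod \
      --resource-group rg-network \
      --vnet-name vnet-dev \
      --remote-vnet vnet-prod \
      --allow-vnet-access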
16. A technology company wants to grant temporary access to an external vendor to review and manage certain Azure resources related to a project. The vendor should use their existing business credentials, and the access should be automatically removed after 30 days. What Azure configuration would best meet this requirement?
Answer: A. Azure AD B2B with access expiration policy (A) is the best solution, as Azure AD B2B allows vendors to use their existing credentials, and an access expiration policy can be configured to remove access automatically after 30 days. Azure AD Connect (B) synchronizes internal identities but does not handle temporary external access. Multi-Factor Authentication (MFA) (C) increases security but does not provide automatic expiration. Privileged Identity Management (PIM) (D) allows for time-based access to privileged roles but does not specifically support temporary external access with expiration policies.
17. A retail company aims to adopt a cloud-based disaster recovery solution to ensure business continuity for their e-commerce platform. They require rapid failover to a backup environment without incurring the costs of maintaining a complete on-premises backup infrastructure. Which characteristic of cloud computing provides the most suitable solution?
Answer: A. Explanation: Disaster Recovery as a Service (DRaaS) (A) is the best cloud solution for this scenario, as it allows the retail company to have a cloud-based backup environment that can quickly be activated in the event of a disaster, ensuring minimal downtime without the expense of maintaining physical backup infrastructure. DRaaS provides automated failover and failback capabilities, which is essential for e-commerce where continuous availability is critical. (B) Infrastructure as a Service (IaaS) provides scalable resources but lacks the specific disaster recovery functionalities required for rapid failover. (C) Serverless Computing is highly flexible for application deployment but is not a direct solution for disaster recovery and failover. (D) Virtual Machines could be part of a custom disaster recovery solution but do not inherently provide the fully managed, fast failover capabilities that DRaaS offers.
18. A research institution has a tight budget and plans to perform non-critical data analysis that requires large computational resources. The analysis can be paused or delayed without impacting results. Which Azure pricing model would allow the institution to perform this analysis at the lowest cost?
Answer: B. Explanation: Spot Instances (B) are the best fit for the research institution because they provide the lowest cost for computational resources, with the trade-off that they are interruptible. Since the analysis is non-critical and can be paused or delayed, Spot Instances offer significant savings, making them ideal. (A) Pay-As-You-Go provides flexibility without interruption but at a higher cost. (C) Reserved Instances are useful for predictable, ongoing workloads, but the upfront commitment is not ideal for this occasional analysis. (D) Free Tier has limited resources and would not support the scale required for large computational tasks.
19. A financial institution requires a virtual machine solution that offers guaranteed compute resources and stable pricing for a consistent workload running 24/7. The organization is looking to minimize costs over a multi-year period. Which Azure virtual machine pricing model would best fit their needs?
Answer: B. Reserved Instances Explanation: Reserved Instances (B) allow customers to lock in compute resources at a lower cost for a one- or three-year commitment, making them highly cost-effective for stable, long-term workloads running continuously. This model provides predictable billing, which is ideal for financial institutions with ongoing needs. Pay-As-You-Go (A) offers flexibility but is more expensive for 24/7 usage. Spot Instances (C) are low-cost but subject to deallocation, unsuitable for guaranteed uptime. Dev/Test Subscriptions (D) offer discounted rates but are licensed only for non-production environments, so they cannot be used for a production financial workload.
20. A university plans to use cloud services for its students, enabling remote access to resources like course materials and research applications. However, it also needs to ensure that sensitive research data is securely stored on its own infrastructure. Which cloud model should the university consider to meet these mixed requirements?
Answer: D. Explanation: Hybrid Cloud (D) is most appropriate, as it allows the university to store sensitive research data on-premises while using cloud resources to provide students with remote access to course materials. This model provides the necessary balance between security and accessibility, as sensitive data remains local, while the cloud’s scalability offers students flexible access. (A) Public Cloud would enable access to resources but would not meet the requirement to store sensitive research data securely on-campus. (B) Private Cloud could securely store sensitive data but would be cost-inefficient and complex for remote student access. (C) Community Cloud might serve shared academic needs but does not address the mixed storage and access needs as directly as Hybrid Cloud.
21. A SaaS provider uses Azure to host its services and needs to allocate separate environments for development, testing, and production, each with independent billing and access controls. The provider wants to ensure that the resources in these environments are kept isolated. How should they structure their Azure subscriptions to meet these requirements?
Correct Answer: B Explanation: Creating separate subscriptions for development, testing, and production environments (B) provides isolated billing and access controls, ensuring that each environment remains fully independent and secure. This setup allows the SaaS provider to allocate specific budgets and apply distinct access policies to each environment, facilitating better management and compliance for production workloads. Option A (single subscription with resource groups) does not provide billing separation, and Option C (different Availability Zones) does not address isolation at the billing and management level. Option D (using tags for access control) lacks the isolation required for secure, independent environments, making B the optimal choice.
22. A large enterprise manages multiple applications on Azure that rely on NoSQL data storage for quick access to structured key-value data. They require a storage type that provides a highly available and scalable solution for rapid data retrieval based on unique identifiers. Which Azure storage option would best meet these requirements?
Answer: B. Azure Table Storage Explanation: Azure Table Storage (B) is specifically designed for NoSQL storage, providing a highly available and scalable environment for key-value pairs and structured data, with low latency for fast retrieval based on unique identifiers, which suits the enterprise’s need for quick access. Blob Storage (A) is optimized for storing large amounts of unstructured data, which is unsuitable for structured NoSQL data. Disk Storage (C) is meant for block-level storage typically associated with virtual machines, lacking the scalability and quick retrieval needed for NoSQL workloads. File Storage (D) is used for shared file access and does not support the rapid access and NoSQL structure requirements.
23. An online education platform offers live streaming for remote classes. Class attendance fluctuates, with peak usage during exam periods and much lower activity in between. How would the consumption-based model on Azure help the platform manage costs effectively?
Answer: C. Explanation: The consumption-based model enables the platform to scale resources based on actual attendance and charges only for resources used (C), making it the most cost-effective option for varying attendance levels. This approach adjusts infrastructure usage based on real-time demand, avoiding unnecessary costs during off-peak times. (A) A subscription fee based on estimates could lead to overpayment or insufficient resources. (B) Reserving resources would not match the irregular peaks and valleys in demand. (D) Requiring a set level of resources would ensure availability but would be inefficient in managing costs given fluctuating demand.
24. A company notices that some of its VMs on Azure are underutilized and could be resized or stopped to save costs. Which feature of Azure Advisor should they rely on to identify such opportunities regularly?
Answer: C. Explanation: The Cost tab in Azure Advisor (C) provides targeted recommendations for identifying underutilized resources, such as resizing or stopping VMs to optimize spending. The Security tab (A) surfaces security best-practice recommendations but doesn’t address cost management. The Performance tab (B) focuses on application responsiveness, not cost reduction. Operational Excellence (D) pertains to deployment and process efficiency, unrelated to ongoing cost-saving insights.
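Those recommendations can also be pulled programmatically; for example, a quick Azure CLI check scoped to cost advice:

    az advisor recommendation list --category Cost --output table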
25. A startup is planning its first cloud deployment and wants to estimate the costs of various Azure services they intend to use, including virtual machines, databases, and storage. They have no existing infrastructure and need a straightforward approach to estimate ongoing monthly costs. Which Azure tool should they use to obtain the most accurate pricing for their new deployment?
Correct Answer: B Explanation: The Azure Pricing Calculator (B) is ideal for this scenario as it allows the startup to select Azure services, configure their specifications, and generate an estimated monthly cost. Since the company has no on-premises infrastructure, the TCO Calculator (A) would not be relevant, as it is intended to compare on-premises costs with Azure to assess potential savings. Azure Migrate (C) assists with planning migrations, including cost assessments, but is more applicable to companies moving existing workloads to Azure rather than a new deployment. Azure Advisor (D) provides best practice recommendations for cost optimization but does not offer a tool for estimating initial deployment costs.
26. A multinational corporation needs to enable secure and isolated communication between Azure resources and its on-premises data centers. The company wants to ensure a stable, private connection for its applications, even during peak internet usage, as it handles sensitive data that should not traverse the public internet. Which networking solution should they implement?
Answer: C. ExpressRoute Explanation: ExpressRoute (C) provides a dedicated, private connection between Azure and on-premises data centers, bypassing the public internet entirely, making it ideal for secure and stable connectivity, especially during peak times. This private link is crucial for handling sensitive data without exposure to internet fluctuations. Site-to-Site VPN (A) can connect on-premises networks to Azure but relies on the internet, which may not provide consistent performance during peak usage. Azure Application Gateway (B) is a load balancing solution for web traffic but does not connect on-premises networks with Azure resources. Azure Load Balancer with private IPs (D) can distribute traffic within Azure but does not facilitate secure connections to on-premises networks.
27. A government agency requires that their Azure resources comply with specific security policies and budgets. They have different departments that need to deploy and manage their own resources, but they also need centralized control over policies and spending limits. What Azure structure should they implement to meet these requirements?
Correct Answer: C Explanation: Using Azure Management Groups with resource groups nested within each department's subscription (C) enables centralized management of policies, budgets, and compliance across multiple departments, while still allowing each department to manage its own resources. This structure provides high flexibility and control over policy enforcement and cost management at various levels. Option A (a single resource group) would limit flexibility and make policy enforcement difficult. Option B (separate resource groups) does not allow for centralized policy management across departments, and Option D (a single subscription with manual policy management) lacks scalability and may lead to inconsistent policy application, making C the most effective setup.
28. A healthcare organization using Azure needs to ensure that medical staff can access different clinical applications through a single set of credentials, reducing login delays during critical times. They also need to maintain strong access controls for compliance. Which Azure feature can best facilitate fast and secure access to multiple applications with a single login?
Answer: A. Single Sign-On (SSO) with Conditional Access (A) best addresses the healthcare organization’s needs by providing unified login access to multiple applications while ensuring secure access through policies that enforce conditional access, supporting compliance and security. Azure AD Privileged Identity Management (PIM) (B) manages access for privileged roles but does not reduce login delays across multiple applications. Pass-through Authentication (C) allows direct authentication against on-premises directories but does not provide single-sign-on access across multiple applications. Role-Based Access Control (RBAC) (D) controls access permissions but does not simplify authentication processes or reduce login instances.
29. A cybersecurity analyst is tasked with securing the login process for the organization’s Azure portal. They need a solution that integrates with existing directory services to enforce secure login practices, including password policies and conditional access. What feature of Azure Active Directory can best accomplish this?
Answer: B. Conditional Access (B) in Azure AD allows the analyst to create policies that apply specific login conditions, such as requiring Multi-Factor Authentication (MFA) based on risk factors like location or device compliance. This feature also integrates with the organization's existing directory services to enforce secure login practices. Azure AD Identity Protection (A) focuses on identifying risks and suspicious login attempts but does not enforce specific conditions. Azure AD Connect (C) is more suited to synchronizing on-premises directories with Azure AD rather than applying conditional access policies. Privileged Identity Management (PIM) (D) allows for time-based and approval-based role activation, but it does not directly enforce login conditions for all users.
30. A software development company plans to deploy its application across multiple virtual machines within the same Azure datacenter. They want to avoid downtime if a particular server rack in the datacenter fails. They also need protection from simultaneous Azure maintenance updates impacting all VMs at once. What is the best way to configure their virtual machines to meet these requirements?
Answer: B. Deploy the VMs in an Availability Set Explanation: Deploying VMs in an Availability Set (B) ensures that they are distributed across multiple fault and update domains, providing resilience against hardware failure on a specific server rack and minimizing the impact of Azure maintenance updates. A single fault domain (A) increases vulnerability to hardware failures since all VMs are in one domain. Spot Virtual Machines (C) offer cost benefits but are unsuitable for applications requiring high availability due to their susceptibility to deallocation. Proximity Placement Groups (D) optimize for low latency but do not provide redundancy across fault and update domains.
31. A team in an organization has been tasked with minimizing their Azure cost by using specific SKUs for virtual machines (VMs). However, team members occasionally deploy unsupported VM SKUs, resulting in higher-than-expected charges. Which feature within Azure can ensure that only approved VM SKUs are used?
Correct Answer: B Explanation: Azure Policy (B) is designed to enforce specific configurations and standards, including restricting resource types like VM SKUs. By applying a policy to limit VM SKUs, the team can ensure only cost-effective, approved SKUs are available for deployment, thus controlling costs. Azure Cost Management (A) is a tool for tracking and analyzing costs but does not prevent non-compliant SKUs from being deployed. Azure Monitor (C) is used to gather and analyze telemetry data, not for enforcing specific configurations. Azure Advisor (D) provides optimization recommendations but cannot enforce the use of specific VM SKUs.
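One way this could look with the Azure CLI, assigning the built-in “Allowed virtual machine size SKUs” definition to the team’s resource group (the resource group, the SKU list, and the parameter name listOfAllowedSKUs are assumptions to verify against the built-in definition):

    # Look up the built-in definition by its display name
    def_name=$(az policy definition list \
      --query "[?displayName=='Allowed virtual machine size SKUs'].name | [0]" -o tsv)

    # Restrict the team's resource group to approved, cost-effective SKUs
    az policy assignment create \
      --name approved-vm-skus \
      --policy "$def_name" \
      --resource-group rg-team-dev \
      --params '{"listOfAllowedSKUs": {"value": ["Standard_B2s", "Standard_D2s_v5"]}}'

Any deployment that requests a SKU outside the approved list is then denied at request time rather than flagged after the cost has already been incurred.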
32. A manufacturing company needs to deploy Azure Virtual Desktop to allow engineers to work remotely on sensitive projects. To maintain data security, they want to limit access to Azure Virtual Desktop from specific geographic regions and ensure compliance with company policies. What feature can they configure to meet these security requirements?
Answer: C. Conditional Access with location-based policies Explanation: Conditional Access with location-based policies (C) allows the manufacturing company to restrict Azure Virtual Desktop access to specific geographic locations, helping ensure that only users within approved regions can connect. This feature supports data security and compliance by enforcing access policies based on location. Azure Policy (A) can help enforce compliance at the resource level but does not control user access based on location. Network Security Groups (B) provide network-level controls but do not restrict access by geographic location without Conditional Access. Azure Resource Locks (D) prevent changes to resources but do not affect user access or location-based restrictions.
33. A logistics company needs to perform a test migration to Azure for one of their on-premises servers to evaluate performance and application compatibility before a full migration. Which feature in Azure Migrate would allow them to perform this without affecting their production environment?
Answer: B. Server Migration with Test Migration Explanation: The Server Migration with Test Migration (B) feature in Azure Migrate allows the logistics company to perform a test migration of a server, enabling them to evaluate performance and compatibility in Azure without impacting their production environment. This feature replicates the server to Azure in a test environment, allowing for troubleshooting and verification before a full migration. Dependency Mapping (A) identifies dependencies but does not perform test migrations. Database Migration (C) is used for migrating databases rather than servers. Cost Estimation (D) provides financial projections but does not facilitate testing in Azure.
34. A company has observed significant monthly fluctuations in their Azure costs and wishes to understand the underlying causes. They need a detailed view of costs associated with each resource and resource group. Which Azure feature provides the best insights to analyze these cost variations?
Correct Answer: A Explanation: Azure Cost Analysis (A) offers a detailed breakdown of costs by resource, resource group, and other parameters, allowing the company to analyze cost variations effectively. This feature helps pinpoint which resources are driving costs and understand usage patterns, making it ideal for analyzing monthly fluctuations. The Azure Pricing Calculator (B) is used to estimate future Azure costs but does not offer insights into actual usage patterns or cost breakdowns. Azure Policy Compliance Dashboard (C) helps enforce policies but does not provide cost insights. Azure Service Health (D) informs about service-related issues but does not assist in analyzing or understanding cost patterns.
35. A government organization requires its Azure-based infrastructure to comply with regulatory standards that mandate continuous monitoring and assessment of security risks. They also need a detailed view of their security posture to prevent data breaches and unauthorized access. Which Azure service should the organization implement to meet these requirements?
Correct Answer: B Explanation: Option (B) is correct because Azure Security Center provides continuous monitoring, security assessments, and a comprehensive view of security posture, which is critical for a government organization needing to comply with regulatory standards. Security Center also provides insights into configuration weaknesses, recommendations for compliance, and alerts on suspicious activities, aligning with the organization’s need for stringent governance. Option (A) is incorrect as Azure Monitor offers infrastructure-level insights but lacks detailed security assessments and compliance monitoring. Option (C) is incorrect because Conditional Access focuses on user and device access control but does not cover overall security posture or continuous assessments. Option (D) is incorrect as Azure Policy enforces specific configurations but does not provide an in-depth view of security posture or threat monitoring, making it insufficient for full regulatory compliance.
36. An organization has configured a Site-to-Site VPN between their on-premises network and Azure using Azure VPN Gateway. They now want to optimize routing to avoid asymmetric routing issues, particularly as they plan to add more VNets. Which protocol should they enable on Azure VPN Gateway to improve routing efficiency and avoid routing conflicts?
Answer: A. Border Gateway Protocol (BGP) Explanation: Enabling Border Gateway Protocol (BGP) (A) on Azure VPN Gateway enhances routing efficiency and prevents asymmetric routing issues by dynamically managing routing paths and avoiding conflicts. BGP is especially useful when connecting multiple VNets or on-premises sites, as it allows automatic route updates and better routing control. Internet Protocol Security (IPsec) (B) provides encryption for VPN tunnels but does not manage routing. Dynamic Host Configuration Protocol (DHCP) (C) assigns IP addresses but is unrelated to routing optimization. Transmission Control Protocol (TCP) (D) is a transport layer protocol and does not manage network routing.
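As a sketch of how BGP might be enabled when the gateway is created (all names are placeholders, and 65515 is the ASN Azure commonly uses by default; the on-premises VPN device must also be configured with its own ASN and BGP peering):

    az network vnet-gateway create \
      --name vpn-gw-hub \
      --resource-group rg-network \
      --vnet vnet-hub \
      --public-ip-address pip-vpn-gw \
      --gateway-type Vpn \
      --vpn-type RouteBased \
      --sku VpnGw1 \
      --asn 65515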
37. A development team has large datasets in an Azure Blob Storage account in one region and needs to transfer these datasets to another Azure Blob Storage account in a different region for data replication purposes. They are looking for a secure and efficient way to move this data directly between storage accounts without bringing it on-premises. Which tool would best meet their needs?
Answer: B. AzCopy Explanation: AzCopy (B) supports direct data transfer between Azure Blob Storage accounts, including cross-region transfers, which makes it the ideal choice for securely moving data between storage accounts without the need for local downloads. This tool allows for efficient data replication across regions and includes options for secure data transfer. Azure Data Box (A) is a physical device and is not suitable for cloud-to-cloud transfers. Azure Storage Explorer (C) provides an interface for managing storage but is not optimized for large-scale, direct transfers between Blob Storage accounts. Azure File Sync (D) is designed for syncing on-premises file servers with Azure Files, not for transferring data directly between Blob Storage accounts.
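A sketch of what such a service-to-service copy could look like with AzCopy v10 (both account URLs and SAS tokens are placeholders):

    azcopy copy \
      "https://srcaccount.blob.core.windows.net/datasets?<src-SAS>" \
      "https://dstaccount.blob.core.windows.net/datasets?<dst-SAS>" \
      --recursive

Because the copy runs between the two storage accounts, the data does not have to pass through the machine issuing the command.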
38. A DevOps engineer needs to check the status and configuration of a Virtual Machine (VM) in Azure to verify if it's running and meets the necessary specifications. They want to use Azure PowerShell for this verification. Which command should they execute to retrieve detailed information about the VM’s state and configuration?
Correct Answer: A Explanation: The correct answer is A. Get-AzVM allows the engineer to retrieve comprehensive details about a virtual machine, including its current state (running, stopped, etc.) and its configuration. This cmdlet is essential for checking both status and specifications in Azure PowerShell. Check-AzVMStatus (B), Verify-AzVM (C), and Start-AzVMStatus (D) are not actual Azure PowerShell cmdlets, making them incorrect for this purpose. Get-AzVM provides a straightforward method for validating VM details as part of routine management tasks.
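For instance, a quick check of one VM’s power state and configuration might look like this (the resource group and VM names are placeholders):

    # -Status returns the instance view, including the current power state
    Get-AzVM -ResourceGroupName "rg-prod" -Name "vm-web01" -Status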
39. A cloud administrator wants to improve cost control within their organization by using tags to identify the purpose and ownership of each resource. They also need to ensure that untagged resources are automatically identified and assigned a default tag to avoid gaps in cost reporting. Which Azure service provides this level of control?
Correct Answer: A Explanation: Azure Policy (A) provides the capability to enforce tagging standards by automatically applying default tags to resources or identifying untagged resources, which aids in consistent cost tracking and control. By setting up a policy to add a default tag (e.g., “Owner: Unknown”) to any untagged resource, the cloud administrator ensures that every resource is categorized, allowing for accurate cost reporting. Azure Security Center (B) focuses on security, not cost management or tagging. Azure Cost Analysis (C) enables spending review but does not enforce tagging practices. Azure Monitor (D) provides resource monitoring and alerting but does not manage or apply tags.
40. A government agency is considering using Azure to host a large-scale data analysis platform. Given the nature of their work, they are required to adhere to strict security standards and maintain control over access to sensitive information. If the agency opts for a hybrid setup involving both on-premises infrastructure and Azure’s Platform as a Service (PaaS), which security responsibility would the agency retain?
Answer: B. Explanation: In a PaaS environment, configuring encryption for data within the application (B) is a responsibility retained by the customer, in this case, the government agency. This includes ensuring that data is securely encrypted, whether at rest or in transit, to meet regulatory requirements. Physical hardware security (A) and virtualization security (C) are managed by Azure, as is the security of the data center’s network infrastructure (D). The agency’s focus should be on securing data within their application, applying proper encryption and access controls to protect sensitive information from unauthorized access.
41. A healthcare organization is using Azure DNS to manage its application’s DNS records. For compliance reasons, they need to restrict the ability to modify DNS records to specific administrators. Which Azure feature should they implement to ensure only authorized personnel can manage DNS records?
Answer: A. Azure Role-Based Access Control (RBAC) Explanation: Azure Role-Based Access Control (RBAC) (A) enables the healthcare organization to assign specific roles to administrators, controlling who can manage DNS records within Azure DNS. By using RBAC, they can restrict DNS modification permissions to authorized users only, ensuring compliance with security policies. Network Security Groups (NSGs) (B) control network traffic but do not provide access management for DNS resources. Azure Active Directory B2C (C) is for managing customer identity and access, not internal role-based permissions. Azure Multi-Factor Authentication (D) enhances security but does not control permissions directly for Azure DNS.
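For example, the built-in DNS Zone Contributor role could be granted to the approved administrators’ group at the scope of a single zone, roughly like this (the group object ID, subscription ID, resource group, and zone name are placeholders):

    az role assignment create \
      --assignee "<admin-group-object-id>" \
      --role "DNS Zone Contributor" \
      --scope "/subscriptions/<sub-id>/resourceGroups/rg-dns/providers/Microsoft.Network/dnszones/contoso.com"

Scoping the assignment to the zone rather than the subscription keeps the permission as narrow as the compliance requirement demands.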
42. A multinational corporation with critical operations in Brazil seeks to leverage Azure’s paired region structure for enhanced disaster recovery. They need to ensure that their data remains within Latin America in case of failover. Which Azure region pair best satisfies this requirement?
Correct Answer: D Explanation: Brazil South and Brazil Southeast (D) are the most suitable Azure region pair for the company’s needs, providing disaster recovery within Latin America and ensuring data residency compliance. By choosing a Brazil-based region pair, the company can ensure that data remains within Latin America, addressing both compliance and redundancy requirements. Brazil South paired with East US (A), South Central US (B), or Central US (C) would lead to cross-border data replication, violating residency rules. Brazil South and Brazil Southeast provide a secure, compliant option for failover within the same continent.
43. An e-commerce company is hosting a web application on Azure and needs the application to be accessible to users worldwide over the internet. They also require that specific backend services, like their database, remain secure within their virtual network without public exposure. Which configuration best meets these requirements?
Answer: C. Public Endpoint for the application and Private Endpoint for the database Explanation: A Public Endpoint for the application and a Private Endpoint for the database (C) allow the e-commerce company to make the web application accessible to users over the internet while keeping the database secure within the virtual network. This configuration ensures the application is accessible globally, while the database is protected from public exposure, maintaining security for backend services. Public Endpoints for all resources (A) would expose the database unnecessarily. Private Endpoints for both (B) would make the application inaccessible to internet users. ExpressRoute (D) is a private connection for on-premises networks and is not suitable for public access to the application.
44. A financial services company must maintain high availability for its Azure-hosted applications. To proactively respond to any potential Azure service disruptions, what is the best approach for the company to monitor and be notified of issues that may impact their service availability?
Answer: A. Explanation: The financial services company should enable Service Health alerts (A) to receive real-time notifications on any Azure service outages or planned maintenance that could affect availability. This proactive approach allows them to take preventive action or inform stakeholders as needed. Resource Health (B) provides useful insights for individual resources but does not send real-time alerts for broader Azure issues. Azure Monitor metric alerts (C) focus on performance metrics rather than service disruptions. Cost Management (D) is unrelated to monitoring service availability or health.
45. An analytics company needs a compute solution to perform large-scale batch processing on extensive datasets during off-peak hours. The company is looking for an efficient way to manage resources and reduce costs, as the compute requirements will only be needed temporarily each day. Which Azure service should they consider?
Answer: C. Azure Batch Explanation: Azure Batch (C) is specifically designed for large-scale batch processing and is cost-effective for scheduled, resource-intensive tasks that don’t require constant runtime, aligning with the company's off-peak processing needs. Azure Batch efficiently manages resource allocation and supports job scheduling, which is ideal for temporary, high-volume compute tasks. Azure Kubernetes Service (A) supports containerized workloads but is generally optimized for persistent or dynamically scaled microservices rather than scheduled batch processing. Azure Functions (B) operate on an event-driven model and are not ideal for high-throughput, multi-step batch jobs. Azure Virtual Machines (D) could fulfill the need but would incur costs even during idle times, making it a less cost-effective option than Azure Batch.
46. A consulting firm has implemented passwordless authentication in Azure using Microsoft Authenticator, but they are experiencing issues with users who frequently travel to areas with limited internet access. To improve reliability, the firm wants to ensure that users can authenticate even offline. What solution would address this requirement while maintaining passwordless authentication?
Answer: A. Microsoft Authenticator app with code generation (A) is the best solution, as it allows users to generate verification codes directly in the app without needing an internet connection, maintaining a passwordless experience. SMS-based verification (B) and phone call verification (C) both require connectivity, which would not resolve offline access issues. FIDO2 security keys (D) are another option, but they require physical keys, which may not be as convenient for travelers as the Microsoft Authenticator app with offline code generation.
47. A government agency is building a secure infrastructure in Azure, intending to prevent attacks through multiple layers, including identity, application, and network. As part of the defense-in-depth model, they are implementing RBAC, Conditional Access, and Web Application Firewalls (WAF). How does this multi-layered approach align with defense-in-depth principles?
Answer: A. By securing the identity, application, and network layers independently, creating a redundant protection strategy (A), the agency is aligning with the defense-in-depth model, which relies on multiple independent layers of security to create redundancy, ensuring that if one layer is breached, other layers continue to provide protection. Centralizing security measures into one layer (B) contradicts defense-in-depth, which is inherently layered. Minimizing authentication requirements (C) reduces security, while assigning the same level of access (D) ignores the principle of layered, least-privilege access.
48. A software company needs to implement a message-processing system to coordinate tasks between distributed application components hosted on Azure. Each component needs a mechanism to store and retrieve messages reliably in a queue format to ensure message delivery even if a system component temporarily goes offline. Which Azure Storage service is most suitable for this purpose?
Answer: B. Azure Queue Storage Explanation: Azure Queue Storage (B) is specifically designed for scenarios where distributed application components need to communicate asynchronously through messages. It enables reliable message storage and delivery, allowing components to retrieve messages as they become available, which is essential for inter-component coordination in distributed applications. Blob Storage (A) is intended for unstructured data storage, not message queuing. File Storage (C) provides managed file shares for shared access rather than queuing, and Table Storage (D) is optimized for fast key-value pair access in NoSQL scenarios, not for managing message queues between application components.
49. A public sector organization in the United Kingdom plans to deploy a cloud solution on Azure to support confidential citizen data. The organization requires the solution to adhere to UK-specific data sovereignty requirements and operate in an isolated, compliant environment tailored to government standards. Which Azure region should they select to meet these needs?
Correct Answer: C Explanation: UK Government South (C) is a region within the Azure UK Government cloud, designed to meet UK-specific data sovereignty, privacy, and compliance requirements. This region provides an isolated and highly secure environment, suitable for handling confidential government data, making it the best choice for this public sector organization. UK South (A) and North Europe (B) are standard Azure regions and do not provide the level of isolation or compliance required for UK government workloads. East US (D) is in the United States and is not compliant with UK data residency requirements, making UK Government South the optimal solution for handling sensitive citizen data securely within UK borders.
50. A project manager wants to prevent accidental deletions of a set of development resources in Azure while still allowing the development team to modify and update configurations as needed. Which Azure lock type should be applied to these resources to support these specific needs?
Correct Answer: C Explanation: A Delete Lock (C) is the best solution here, as it prevents accidental deletions but still allows modifications, enabling the development team to continue making necessary changes without risking resource deletion. A Read-Only Lock (A) would restrict modifications as well, which is more restrictive than necessary. Access Control Lock (B) and Write Lock (D) are not actual lock types in Azure and do not apply in this scenario.
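A delete lock of this kind might be applied at the resource-group level with the Azure CLI (the lock and group names are placeholders); CanNotDelete is the lock level that corresponds to a Delete Lock:

    az lock create \
      --name protect-dev-resources \
      --lock-type CanNotDelete \
      --resource-group rg-dev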
51. A financial institution relies on a critical database in Azure and wants to be notified of any resource failures or connectivity issues. They are looking for an alert that provides specific information about the health of this database, including any potential downtimes. What is the best type of alert to configure in Azure Monitor for this scenario?
Answer: B. Explanation: The financial institution should configure a Resource Health Alert (B), which provides real-time status updates on the health of specific resources, such as databases, alerting the team to downtimes or other issues. Resource Health Alerts are designed for monitoring the operational health of individual resources. Metric Alerts (A) track specific performance metrics like CPU and memory but do not indicate overall resource health. Service Health Alerts (C) report broader Azure service disruptions but lack resource-specific insights. Activity Log Alerts (D) monitor configuration changes but don’t address resource connectivity or health.
52. A manufacturing company needs to deploy a fully functional Enterprise Resource Planning (ERP) system to manage its inventory, accounting, and human resources across multiple factories. They want a solution that includes centralized updates, secure access, and minimal IT involvement in maintenance and configuration. Which cloud service model in Azure should they adopt?
Answer: C. Software as a Service (SaaS) Explanation: Software as a Service (SaaS) (C) is the best model for the ERP requirements of the manufacturing company, as SaaS provides a fully functional ERP solution with centralized updates and secure access while eliminating the need for IT management of the underlying systems. SaaS ERP solutions are designed for organizational needs like inventory, HR, and accounting with minimal configuration, making it ideal for this scenario. Platform as a Service (PaaS) (A) supports app development rather than fully managed business applications. Infrastructure as a Service (IaaS) (B) and Virtual Machines (D) involve managing infrastructure, which is unnecessary for the company’s requirements of minimal IT involvement.
53. An engineer needs to monitor the performance metrics of a specific Azure Virtual Machine (VM) using Azure CLI. They want to retrieve the CPU utilization over a specified time range to analyze usage patterns. Which Azure CLI command would best achieve this goal?
Correct Answer: C Explanation: The correct answer is C. “az monitor metrics list” with the --metric flag allows users to retrieve specific performance metrics for a resource, such as CPU utilization, over a specified interval. This command is suitable for monitoring VMs as well as other Azure resources. “az vm monitor metrics list” (A) is not an actual CLI command. “az vm show” (B) retrieves the general details of a VM but not its performance metrics, and “az resource list” (D) lists resources without providing detailed performance metrics like CPU utilization.
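A concrete form of that command for a single VM might look like the following (the subscription ID, resource group, VM name, and time range are placeholders; “Percentage CPU” is the standard CPU metric emitted by Azure VMs):

    az monitor metrics list \
      --resource "/subscriptions/<sub-id>/resourceGroups/rg-prod/providers/Microsoft.Compute/virtualMachines/vm-web01" \
      --metric "Percentage CPU" \
      --start-time 2024-05-01T00:00:00Z \
      --end-time 2024-05-02T00:00:00Z \
      --interval PT1H \
      --output table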
54. An engineering firm needs a disaster recovery plan that includes Azure File Sync to replicate their on-premises files to Azure. If their on-premises server goes down, they want to continue accessing these files from another server in a different office location. Which configuration would ensure that all files are available on both servers and in Azure?
Answer: C. Register both servers with the same sync group Explanation: By registering both servers with the same sync group (C), Azure File Sync will keep all data synchronized across both servers and Azure, enabling access from either server. If one server goes down, the other can still provide access to the files, creating a reliable disaster recovery setup. Enabling Cloud Tiering (A) helps optimize storage but does not ensure full data replication across servers. Setting up a second sync group (B) would separate file synchronization between servers, reducing data consistency. Geo-redundancy (D) is not a feature of Azure File Sync itself, though Azure Files in the background may have redundancy options.
55. A multinational enterprise wants to deploy applications in multiple Azure regions and requires a private, secure, and low-latency connection between virtual networks in different regions. Which Azure feature should they use to establish these connections across regions?
Answer: B. Virtual Network Peering Explanation: Virtual Network Peering (B) allows the enterprise to connect VNets across different Azure regions securely and with low latency, using Azure’s backbone network. VNet Peering enables seamless communication between VNets without exposing traffic to the public internet. VPN Gateway with Site-to-Site VPN (A) connects on-premises networks with Azure but is not as efficient for inter-region VNet communication. Application Gateway with internal load balancing (C) distributes web traffic but does not provide VNet connectivity. Public IP addresses (D) would expose each VNet to the internet, which is unsuitable for a secure, private network configuration.
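As a minimal sketch (VNet, resource group, and peering names are placeholders), global VNet peering is configured once in each direction:

  # Peer a VNet in West Europe to a VNet in Southeast Asia
  az network vnet peering create --name weu-to-sea --resource-group rg-weu --vnet-name vnet-weu \
    --remote-vnet /subscriptions/<sub-id>/resourceGroups/rg-sea/providers/Microsoft.Network/virtualNetworks/vnet-sea \
    --allow-vnet-access

  # Repeat in the reverse direction so traffic can flow both ways
  az network vnet peering create --name sea-to-weu --resource-group rg-sea --vnet-name vnet-sea \
    --remote-vnet /subscriptions/<sub-id>/resourceGroups/rg-weu/providers/Microsoft.Network/virtualNetworks/vnet-weu \
    --allow-vnet-access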
56. A healthcare organization needs to secure access to patient data stored in Microsoft Entra ID-integrated applications. They want to detect and respond to potentially risky sign-ins automatically. Which feature of Microsoft Entra ID provides real-time risk detection and offers automated responses to high-risk activities?
Answer: B. Microsoft Entra Identity Protection (B) is the best choice here, as it provides real-time risk detection for sign-ins, flags potentially suspicious activities, and enables automated responses, such as requiring Multi-Factor Authentication for high-risk sign-ins. Microsoft Entra Identity Governance (A) focuses on access governance rather than on real-time risk detection. Privileged Identity Management (PIM) (C) manages privileged roles but does not focus on automated threat responses. Application Proxy (D) facilitates secure remote access to on-premises apps but does not offer real-time risk detection.
57. A financial firm wants to ensure its applications in Azure can handle failover scenarios across multiple regions to achieve optimal disaster recovery. Due to sensitive financial data, they also need to maintain data residency within Europe. Which pair of Azure regions would be the most appropriate for deploying this disaster recovery setup while meeting the regulatory requirements?
Correct Answer: B Explanation: North Europe and West Europe (B) are the ideal region pair in this scenario because both reside within Europe, preserving the data residency compliance that is crucial for financial institutions handling sensitive data. Selecting a region pair within the same geopolitical area gives the firm lower latency between primary and backup sites while adhering to the regulatory requirement to keep financial data within Europe. East US and West Europe (A) would violate data residency because of the cross-continent pairing, and Southeast Asia and East Asia (C) are outside Europe entirely. West US and Central US (D) are both within the US, making them non-compliant with European data residency requirements.
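To illustrate (resource group and storage account names are placeholders): West Europe's paired region is North Europe, so a geo-redundant storage account created in West Europe replicates within the EU automatically:

  az group create --name rg-fin-eu --location westeurope

  # Standard_GRS replicates asynchronously to the paired region (North Europe), keeping data in Europe
  az storage account create --name stfineu01 --resource-group rg-fin-eu --location westeurope --sku Standard_GRS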
58. A startup wants to analyze how users are interacting with their application on Azure, including which features are accessed most frequently and where users are experiencing errors. They plan to use Application Insights for this analysis but need guidance on which feature would give them the best insights into user behavior and error patterns. Which feature should they focus on?
Answer: A Explanation: The startup should focus on the User Flows feature (A) in Application Insights, which visualizes the paths users take through the application and identifies popular features and drop-off points, making it ideal for understanding user behavior and pinpointing error-prone areas. Option (B), Metric Explorer, provides broad metrics analysis but lacks user-specific insights. Option (C), Availability Tests, covers uptime testing rather than actual user interactions. Option (D), Dependency Tracking, monitors service-to-service calls within the application, not user behavior.
59. A technology startup has recently implemented Microsoft Defender for Cloud to protect their Azure resources. They notice that Defender for Cloud offers automatic protection and vulnerability assessments. Which of the following best describes the primary purpose of using Microsoft Defender for Cloud in this scenario?
Answer: A. The primary purpose of Microsoft Defender for Cloud (A) in this scenario is to provide proactive threat detection, automatic security recommendations, and vulnerability assessments, which are essential for safeguarding the startup’s resources in Azure. Synchronizing identities (B) is managed by Azure AD Connect, not Microsoft Defender for Cloud. Centralized billing and resource monitoring (C) are not the core functions of Defender for Cloud. Enforcing network routing policies with Network Security Groups (D) is a separate function that does not involve proactive threat detection or vulnerability assessments.
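For example, a hedged sketch of enabling a Defender for Cloud plan at the subscription level with the CLI; the plan name shown is one of several available plans, not the only one the startup might need:

  # Turn on the Defender plan for servers/VMs in the current subscription
  az security pricing create --name VirtualMachines --tier Standard

  # Review the security assessments and recommendations Defender for Cloud has generated
  az security assessment list --output table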
60. A European retail company plans to launch its online shopping platform on Azure and wants to ensure that customer data remains compliant with EU data residency requirements. How can Azure datacenters help the company meet these compliance needs?
Correct Answer: B Explanation: Using Azure datacenters located within the European Union (B) ensures that customer data is stored and processed within EU borders, which is essential for meeting EU data residency compliance. Azure offers several regions within the EU, allowing organizations to meet regulatory requirements while also accessing services that are geographically closer to end users. Option A (deploying resources in any datacenter) does not guarantee compliance with EU data residency rules, and Option C (datacenter in the United States) would violate EU data residency requirements by storing data outside of the EU. Option D (Geo-redundant Storage) does not ensure data is kept within the EU unless both regions are in the EU, making B the best choice.
61. A startup specializing in AI modeling stores large training datasets in Azure Blob Storage. During the model training phase, these datasets are accessed frequently, but between phases, access drops significantly. What storage strategy should they use to manage storage costs effectively without affecting performance during the training phase?
Answer: C. Move datasets between Hot and Cool Tiers as access frequency changes Explanation: Moving datasets between the Hot and Cool Tiers (C) based on access frequency lets the startup balance performance and cost. During training, datasets stay in the Hot Tier to ensure high-performance access; when access drops between training phases, the data can be moved to the Cool Tier to lower costs without impacting accessibility. Storing data solely in the Archive Tier (A) would significantly delay retrieval times, making it unsuitable for the frequent access required during training. Storing data in the Cool Tier only (B) would not deliver the performance needed during active training. The Premium Tier (D) is designed for high transaction rates, which is unnecessary for dataset storage and cost-inefficient for this purpose.
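As a simple sketch (the storage account, container, and blob names are placeholders), individual blobs can be moved between tiers on demand:

  # Demote a training dataset to the Cool tier between training phases
  az storage blob set-tier --account-name sttraindata --container-name datasets --name corpus-v3.parquet --tier Cool --auth-mode login

  # Promote it back to Hot before the next training run
  az storage blob set-tier --account-name sttraindata --container-name datasets --name corpus-v3.parquet --tier Hot --auth-mode login

In practice, a blob lifecycle management policy can automate these moves based on how recently the data was modified or accessed, instead of issuing the commands manually.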
62. A multinational corporation with a complex IT infrastructure wants to leverage Azure's governance capabilities to manage a mix of resources in different regions, including non-Azure clouds and on-premises servers. The goal is to apply consistent role-based access controls, tagging, and policies across all resources from a single interface. Which solution is best suited to meet this requirement?
Correct Answer: C Explanation: The correct answer is C. Azure Arc enables the extension of Azure’s governance and management capabilities, such as role-based access control, tagging, and policy enforcement, to non-Azure and on-premises resources, making it possible for the corporation to achieve consistency across hybrid and multi-cloud environments. Azure Active Directory (A) manages identity but does not provide comprehensive governance over hybrid infrastructure. Azure Blueprints (B) supports deployment of repeatable environments but is limited to Azure resources. Deploying Azure Virtual Machines in each region (D) would not meet the goal of managing a diverse hybrid environment consistently.
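As a rough sketch (machine, group, and policy names are placeholders): once an on-premises server has been connected with the Azure Arc agent, it appears as an Azure resource and can be tagged and governed like any other:

  # List Arc-enabled servers (requires the connectedmachine CLI extension)
  az connectedmachine list --resource-group rg-hybrid --output table

  # Apply the same tags used on native Azure resources
  az resource tag --ids /subscriptions/<sub-id>/resourceGroups/rg-hybrid/providers/Microsoft.HybridCompute/machines/onprem-sql01 \
    --tags env=prod costcenter=finance

  # Assign an existing policy definition at the resource group scope so it also covers the Arc resources
  az policy assignment create --name require-tags --policy <policy-definition-id> \
    --scope /subscriptions/<sub-id>/resourceGroups/rg-hybrid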
63. A financial analytics firm is developing a complex application that requires various microservices working in tandem to provide real-time analysis. They need an environment that supports easy deployment, scalability, and integration of services with minimal infrastructure overhead. The team prefers to avoid server maintenance and focus on developing application logic. Which cloud service model should the firm implement?
Answer: C. Platform as a Service (PaaS) Explanation: Platform as a Service (PaaS) (C) is suitable here because it supports microservice deployment with built-in scalability and integration options. PaaS minimizes the need for infrastructure management, allowing the firm to focus on the application logic for real-time analytics. IaaS (A) requires significant management of servers and infrastructure, adding overhead. Serverless Computing (B) is better suited to individual functions than to complex applications with multiple interdependent services. SaaS (D) provides fully managed software, which would limit the firm's ability to customize the application to meet its analytics requirements.
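For illustration (the plan, app, and resource group names are placeholders, and the exact runtime string varies by CLI version), one PaaS option is to host each microservice as an App Service web app on a shared plan:

  az appservice plan create --name asp-analytics --resource-group rg-analytics --sku P1V3 --is-linux

  # One web app per microservice; patching and scaling of the underlying servers is handled by the platform
  az webapp create --name analytics-pricing-api --plan asp-analytics --resource-group rg-analytics --runtime "PYTHON:3.11"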
64. A financial services company uses a core application on Microsoft Azure that processes sensitive transactions 24/7. To maintain strict uptime requirements and minimize disruptions, they want to ensure that the application can withstand regional outages and manage variable workloads effectively. Which approach would best meet these high availability and scalability needs?
Correct Answer: B Explanation: Option (B) is correct because deploying the application across multiple regions with Azure Traffic Manager provides both regional redundancy and global load balancing, which is essential for high availability and scalability, especially during regional failures. Azure Traffic Manager routes traffic to healthy instances, ensuring continuous service even if one region goes down. Option (A) is incorrect because while Availability Zones within a single region improve local redundancy, they do not protect against regional outages. Option (C) is incorrect because Azure Site Recovery is designed for disaster recovery, not real-time high availability or scalability. Option (D) is incorrect because premium storage may improve performance for a single VM but does not address scalability or multi-region high availability.
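As a hedged sketch (the profile, DNS prefix, and endpoint names are placeholders), a Traffic Manager profile fronting two regional deployments might look like this:

  az network traffic-manager profile create --name tm-corebank --resource-group rg-global \
    --routing-method Performance --unique-dns-name corebank-prod

  # Add one endpoint per regional deployment (for example, an App Service or public IP in each region)
  az network traffic-manager endpoint create --name weu-endpoint --profile-name tm-corebank --resource-group rg-global \
    --type azureEndpoints --target-resource-id <west-europe-app-resource-id> --endpoint-status Enabled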
65. A healthcare organization needs to host a microservices-based application in a way that isolates each service within its environment. They also need flexibility to deploy and scale individual components independently to optimize resource usage. Which Azure service is best suited for this architecture?
Answer: D. Azure Kubernetes Service (AKS) Explanation: Azure Kubernetes Service (D) is specifically designed for managing and orchestrating containers, making it the optimal solution for hosting a microservices-based application where each service operates in isolation within its container. AKS supports independent scaling and deployment of each microservice, providing flexibility and efficient resource utilization. Azure App Service (A) does not provide the same level of control over container orchestration and isolation as AKS. Hosting each service on separate Azure Virtual Machines (B) would be costly and more challenging to manage for a microservices architecture. Azure Functions (C) are suited for event-driven, stateless tasks rather than managing complex microservices applications with interdependencies.
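For instance, a minimal sketch (cluster, group, and deployment names are placeholders) of creating an AKS cluster with the cluster autoscaler enabled, so node capacity follows the demands of the individual microservices:

  az aks create --resource-group rg-health --name aks-patient-portal --node-count 2 \
    --enable-cluster-autoscaler --min-count 2 --max-count 6 --generate-ssh-keys

  # Fetch credentials, then scale a single microservice independently of the others
  az aks get-credentials --resource-group rg-health --name aks-patient-portal
  kubectl scale deployment appointments-api --replicas 4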