CompTIA Cloud+ Study Guide. Ben Piper


reference to determine cloud capacity increases and decreases.

      85 C. The Domain Name System records need to be changed to reflect the new IP address mapped to the domain name.

      86 C. Database read and write requests utilize storage I/O, so storage I/O should be the focus for troubleshooting.

      87 C. Elasticity allows for cloud services to expand and contract based on actual usage and would be applicable to increasing storage capacity.

      88 C. Workflow applications track a process from start to finish and sequence the applications that are required to complete the process.

      89 A, C, D. Resources such as RAM, CPU cycles, and storage capacity commonly become saturated as your cloud compute requirements grow.
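
      A minimal monitoring sketch, assuming Python 3 and the third-party psutil library (not part of the standard library), that checks whether these three resources are nearing saturation; the 80 percent threshold is an arbitrary example value:

        import psutil  # third-party library, assumed installed (pip install psutil)

        THRESHOLD = 80.0  # example saturation threshold in percent (assumption)

        metrics = {
            "cpu": psutil.cpu_percent(interval=1),    # CPU utilization sampled over 1 second
            "ram": psutil.virtual_memory().percent,   # RAM currently in use
            "disk": psutil.disk_usage("/").percent,   # storage capacity used on the root volume
        }

        for name, pct in metrics.items():
            status = "nearing saturation" if pct >= THRESHOLD else "ok"
            print(f"{name}: {pct:.1f}% ({status})")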

      90 B, C. In addition to the web servers, IP addresses may be required for the DNS server and the default gateway.

      91 B. The question asks about the ability to access a specific cloud service, which concerns whether Jill has authorization to access the storage volume. Authentication and SSO are login systems, not rights to services. A federation links user databases.

      92 A. The tracert and traceroute commands are useful for network path troubleshooting. These commands show the routed path a packet of data takes from source to destination. You can use them to determine whether routing is working as expected or whether there is a route failure in the path. The other options are all incorrect because they do not provide network path data.
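
      As a minimal sketch, assuming Python 3 and a placeholder destination hostname, the same path trace can be launched from a script by wrapping the platform's native command:

        import platform
        import subprocess

        target = "app.example.com"  # placeholder destination host

        # Windows ships tracert; Linux and macOS use traceroute
        cmd = ["tracert", target] if platform.system() == "Windows" else ["traceroute", target]

        # Run the trace and print each hop so a route failure in the path is visible
        trace = subprocess.run(cmd, capture_output=True, text=True, timeout=120)
        print(trace.stdout)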

      93 B, D. The Windows command-line utility nslookup resolves domain names to IP addresses. The Linux equivalent is the dig command. The other options are not valid for the solution required in the question.
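
      A comparable forward lookup can be done with the Python standard library; this is only a sketch, and the domain name below is a placeholder:

        import socket

        name = "www.example.com"  # placeholder domain name

        # Ask the system resolver for the addresses mapped to the name,
        # roughly what nslookup and dig report in their answer section
        addresses = sorted({info[4][0] for info in socket.getaddrinfo(name, None)})
        print(f"{name} resolves to: {', '.join(addresses)}")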

      94 B. The Windows Remote Desktop Protocol allows for remote connections to a Windows graphical user desktop.

      95 C. The tcpdump utility allows a Linux system to capture live network traffic, and it is useful in monitoring and troubleshooting. Think of tcpdump as a command-line network analyzer. The dig and nslookup commands show DNS resolution but do not display the actual packets going across the wire. netstat shows connection information and is not DNS-related.
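
      A hedged example of driving the capture from a script, assuming a Linux host with tcpdump installed, root privileges, and an interface named eth0:

        import subprocess

        # Capture ten packets on eth0 without resolving names (-nn), then print them
        capture = subprocess.run(
            ["tcpdump", "-i", "eth0", "-c", "10", "-nn"],
            capture_output=True,
            text=True,
        )
        print(capture.stdout)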

      96 E. In a data center, terminal servers are deployed and have several serial ports, each cabled to a console port on a device that is being managed. This allows you to make an SSH or a Telnet connection to the terminal server and then use the serial interfaces to access the console ports on the devices to which you want to connect. The other options do not provide serial port connections.
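
      A minimal sketch of that first step, assuming Python 3, the third-party paramiko SSH library, and placeholder hostname, credentials, and command for the terminal server:

        import paramiko  # third-party SSH library, assumed installed

        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())

        # Placeholder terminal-server address and credentials
        client.connect("termserver.example.com", username="admin", password="changeme")

        # From this session you would then attach to the serial line of a managed device
        stdin, stdout, stderr = client.exec_command("show line")  # hypothetical command
        print(stdout.read().decode())
        client.close()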

      97 C. Infrastructure security is the hardening of the facility and includes the steps outlined in the question, such as nondescript facilities, video surveillance, and biometric access.

      98 C, E. Common remote access tools include RDP, SSH, and terminal servers. IDSs/IPSs are for intrusion detection, and DNS is for domain name–to–IP address mappings and is not a utility for remote access.

      99 C. A secure Internet-based connection would be a VPN.

      100 A. Logging into systems is referred to as authentication. Also, the question references multifactor authentication (MFA) as part of the system.

       THE FOLLOWING COMPTIA CLOUD+ EXAM OBJECTIVES ARE COVERED IN THIS CHAPTER:

       1.1 Compare and contrast the different types of cloud models.
        Deployment models
         Public
         Private
         Hybrid
         Community
         Cloud within a cloud
         Multicloud
         Multitenancy
        Service models
         Infrastructure as a service (IaaS)
         Platform as a service (PaaS)
         Software as a service (SaaS)
        Advanced cloud services
         Internet of Things (IoT)
         Serverless
         Machine learning/Artificial intelligence (AI)
        Shared responsibility model

       1.3 Explain the importance of high availability and scaling in cloud environments.
        Hypervisors
         Affinity
         Anti-affinity
        Regions and zones
        High availability of network functions
         Switches
         Routers
         Load balancers
         Firewalls
        Scalability
         Auto-scaling

       1.4 Given a scenario, analyze the solution design in support of the business requirements.
        Environments
         Development
         Quality assurance (QA)
         Staging
         Production
        Testing techniques
         Vulnerability testing
         Penetration testing
         Performance testing
         Regression testing
         Functional testing
         Usability testing

       3.1 Given a scenario, integrate components into a cloud solution.
        Application
         Serverless

       4.1 Given a scenario, configure logging, monitoring, and alerting to maintain operational status.
        Monitoring
         Baselines
         Thresholds

       4.3 Given a scenario, optimize cloud environments.
        Placement
         Geographical
         Cluster placement
         Redundancy
         Colocation

       4.4 Given a scenario, apply proper automation and orchestration techniques.
        Automation activities
         Routine operations
         Updates
         Scaling

      Let's start by briefly looking at where cloud sits in the broader scope of the IT world. Before cloud computing, organizations had to acquire the IT infrastructure needed to run their applications. Such infrastructure included servers, storage arrays, and networking equipment like routers and firewalls.

      Options for where to locate this infrastructure were limited. An organization with an ample budget might build an expensive data center consisting of the following:

       Racks to hold servers and networking equipment

       Redundant power sources and backup batteries or generators

       Massive cooling and ventilation systems to keep the equipment from overheating

       Network connectivity within the data center and to outside networks such as the Internet

      Organizations that are less fiscally blessed might rent physical space from a colocation (colo) facility, which is just a data center that leases rack space to the general public. Because colo customers lease only as much space as they need—be it a few rack units, an entire rack, or even multiple racks—this option is much cheaper than building and maintaining a data center from scratch. Customer organizations just have to deal with their own IT equipment and software. To put it in marketing terms, think of a colocation facility as “data center as a service.”

      Related to the data center versus cloud distinction, there are two terms that you need to know. On-premises (on-prem) hosting refers to an organization hosting its own hardware, be it in a data center or a colo. In contrast, cloud computing is an example of off-premises (off-prem) hosting, as the hardware resources are not controlled by the organization that uses them. To make this distinction easy to remember, just equate on-prem with data center and off-prem with cloud.

      This book will reference the National Institute of Standards and Technology (NIST) SP 800-145 publication (https://doi.org/10.6028/NIST.SP.800-145) as the main source of cloud computing definitions.