Architecting Cloud Native Applications

Resulting context

The primary benefit of this solution is that it provides a fully managed, global-scale perimeter around the system that allows self-sufficient, full-stack teams to focus on their value proposition. The perimeter acts as a bulkhead to protect the internals of the system from attacks at the boundaries of the system. These cross-cutting concerns no longer permeate the business logic, which makes that code easier to maintain and removes the need to allocate resources to process that logic. The dedicated infrastructure of the API gateway scales elastically.

Security is performed at the edge of the cloud. We typically only consider CDNs for their ability to cache static content at the edge. However, cloud providers recommend routing all traffic through their CDNs: PUT, POST, DELETE, GET, and so forth, because network-level Distributed Denial of Service (DDoS) attacks are handled by the CDN at the edge before they penetrate an availability zone and your system. Thus, we limit the attack surface of the system by passing all traffic through the CDN. The API gateway itself provides throttling to protect the system at the application level. Further confidence can be attained, as needed, by leveraging the cloud provider's web application firewall (WAF), which is already integrated with the CDN. Following our cloud-native, reactive architecture, we can process internal and external sources, such as access logs and third-party reputation lists, to continuously and dynamically manage rules, such as denying access based on IP address and mitigating the OWASP Top 10 web application vulnerabilities. A self-sufficient, full-stack team would own the components that perform this processing, as usual, while the resulting rules would be leveraged across all services, as shown in the sketch below.
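For example, the rule-management processing described above could end in a small function that pushes flagged addresses into a WAF IP set that backs a deny rule on the CDN distribution. The following is a minimal sketch in TypeScript, assuming AWS WAF (v2) in front of a CloudFront distribution; the IP set name and id, the environment variables, and the upstream log processing that produces the suspiciousIps input are illustrative assumptions, not part of the pattern itself.

```typescript
// Minimal sketch: merge suspicious IP addresses into a WAF IP set that backs a
// deny rule on the CDN distribution. The IP set name/id and the source of the
// suspicious addresses are assumptions for illustration.
import { WAFV2Client, GetIPSetCommand, UpdateIPSetCommand } from '@aws-sdk/client-wafv2';

const waf = new WAFV2Client({ region: 'us-east-1' }); // CLOUDFRONT-scoped WAF resources live in us-east-1

export const denyIpAddresses = async (suspiciousIps: string[]): Promise<void> => {
  // Read the current IP set and its lock token (required for optimistic locking)
  const { IPSet, LockToken } = await waf.send(new GetIPSetCommand({
    Name: process.env.IP_SET_NAME, // assumption: configured per environment
    Id: process.env.IP_SET_ID,     // assumption: configured per environment
    Scope: 'CLOUDFRONT',
  }));

  // Merge the newly flagged addresses (CIDR notation) with the existing ones
  const addresses = new Set([
    ...(IPSet?.Addresses ?? []),
    ...suspiciousIps.map((ip) => `${ip}/32`),
  ]);

  // Write the updated set back; the attached WAF rule then blocks these addresses at the edge
  await waf.send(new UpdateIPSetCommand({
    Name: process.env.IP_SET_NAME,
    Id: process.env.IP_SET_ID,
    Scope: 'CLOUDFRONT',
    Addresses: [...addresses],
    LockToken,
  }));
};
```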

Implementing and scaling OAuth 2.0 and OpenID Connect for authorization and authentication, respectively, is a non-trivial exercise. Authentication is best delegated to a third party, such as AWS Cognito, Azure AD, or Auth0. Once a user has received a JSON Web Token (JWT), it is still necessary to authorize the bearer token each time a service is invoked. An API gateway performs this authorization at the edge and only passes valid tokens down to the business layer, which greatly simplifies the lower layers. The authorization results are also cached for a configurable time to live to improve responsiveness. Integration with the cloud provider's authentication service is usually a turnkey solution, while implementing custom authorizers is also supported, as sketched below. It is common to use multiple authentication providers, one for each user base, and to implement an authorizer for each as needed. We will discuss user-base-focused components in the Backend For Frontend pattern.
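When the turnkey integration is not sufficient, a custom authorizer is just a small function that verifies the bearer token and returns a policy for the gateway to cache. The following is a minimal sketch of a token authorizer in TypeScript, assuming RS256-signed JWTs and the jsonwebtoken library; the issuer, audience, and public key environment variables are illustrative assumptions, and a production authorizer would typically resolve keys via the provider's JWKS endpoint.

```typescript
// Minimal sketch of a custom (Lambda) authorizer that verifies a JWT bearer token
// at the edge and returns an IAM policy. Issuer, audience, and key handling are
// assumptions for illustration.
import * as jwt from 'jsonwebtoken';
import type { APIGatewayTokenAuthorizerEvent, APIGatewayAuthorizerResult } from 'aws-lambda';

export const handler = async (
  event: APIGatewayTokenAuthorizerEvent,
): Promise<APIGatewayAuthorizerResult> => {
  // Strip the "Bearer " prefix from the Authorization header
  const token = (event.authorizationToken ?? '').replace(/^Bearer\s+/i, '');

  // Verify signature, expiry, issuer, and audience before the request reaches the business layer
  let claims: jwt.JwtPayload;
  try {
    claims = jwt.verify(token, process.env.PUBLIC_KEY as string, {
      algorithms: ['RS256'],
      issuer: process.env.ISSUER,     // assumption: the identity provider's issuer URL
      audience: process.env.AUDIENCE, // assumption: this component's client id
    }) as jwt.JwtPayload;
  } catch {
    throw new Error('Unauthorized'); // API Gateway maps this message to a 401 response
  }

  // Allow the invocation; the gateway caches this result for the configured time to live
  return {
    principalId: claims.sub as string,
    policyDocument: {
      Version: '2012-10-17',
      Statement: [{ Action: 'execute-api:Invoke', Effect: 'Allow', Resource: event.methodArn }],
    },
    context: { scope: String(claims.scope ?? '') }, // pass selected claims down to the business layer
  };
};
```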

There are several performance benefits, in addition to the fully managed, auto-scaling infrastructure. The cloud provider's CDN typically has optimized transfer rates between the edge and an availability zone, which improves all traffic, not just GET requests. An API should leverage Cache-Control headers, even for as little as a few seconds, to absorb some load before it reaches the API gateway; a sketch follows. Traffic can be balanced across regions when components are deployed in a multi-region, active-active configuration, in order to improve latency for regional users. This same feature provides failover when there is a regional outage. Implementing multi-regional components is a realistic objective when they are built on an API gateway, function-as-a-service, and cloud-native databases. Monitoring at the API gateway layer also helps provide a realistic picture of users' end-to-end digital experience.
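Even a very short Cache-Control value lets the CDN absorb bursts of identical reads before they reach the gateway. The following is a minimal sketch, assuming a proxy-style function behind the gateway; the handler body and the five-second max-age are illustrative choices.

```typescript
// Minimal sketch: return a short-lived Cache-Control header so the CDN can absorb
// bursts of identical GET requests before they reach the API gateway.
// The handler body and the five-second TTL are illustrative assumptions.
export const getThing = async (event: { pathParameters?: { id?: string } }) => {
  const thing = { id: event.pathParameters?.id, name: 'example' }; // stand-in for a real read

  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/json',
      // Even a few seconds of caching shields the gateway and the business layer from bursts
      'Cache-Control': 'public, max-age=5',
    },
    body: JSON.stringify(thing),
  };
};
```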

Not all API gateway services are created equal. Some provide many more features than most components need, such as a turnkey developer portal. These features may be worth the added expense for a monetized open API that you provide as a service to your customers. However, for internal APIs, such as those we discuss in the Backend For Frontend pattern, these features are unnecessary and should not incur the often significant added cost. Other features, such as complex routing, transformation, fan-out, fan-in, and the like, should be avoided, as coding in the API gateway ought to be considered an anti-pattern. Focus your API gateways on the single responsibility of providing a perimeter around your system's boundaries, and reserve monetization features for the components that actually need them.

Finally, it is not necessary to run all interactions through an API gateway. Third-party services, such as authentication or an offline-first database, provide their own perimeter, and you simply integrate with and leverage those services, as we discuss in the External Service Gateway and Offline-First Database patterns. In addition, some content is dynamic, but not dynamic enough to warrant the use of the full technology stack. Instead, this content can be stored as a materialized view in blob storage, served through the CDN, and mapped to the same RESTful resource path to encapsulate the implementation. A PUT to the blob storage triggers an invalidation on the CDN when the data actually changes, as sketched below. This technique works best for public data, can take a significant load off the API gateway and the internals of the system, and provides extremely high availability.
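The following is a minimal sketch of this technique in TypeScript, assuming an S3 bucket mapped as an origin behind a CloudFront distribution; the bucket name, distribution id, resource path, and TTL are illustrative assumptions.

```typescript
// Minimal sketch: write a materialized view to blob storage under the same RESTful
// resource path that the CDN serves, then invalidate that path so the edge picks up
// the change. Bucket, distribution id, path, and TTL are assumptions for illustration.
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { CloudFrontClient, CreateInvalidationCommand } from '@aws-sdk/client-cloudfront';

const s3 = new S3Client({});
const cloudfront = new CloudFrontClient({});

export const publishView = async (id: string, view: Record<string, unknown>): Promise<void> => {
  const path = `/things/${id}`; // same path a consumer would use against the API

  // PUT the materialized view into blob storage behind the CDN origin
  await s3.send(new PutObjectCommand({
    Bucket: process.env.BUCKET_NAME, // assumption: bucket configured as a CDN origin
    Key: path.slice(1),
    Body: JSON.stringify(view),
    ContentType: 'application/json',
    CacheControl: 'public, max-age=300', // assumption: modest TTL for public data
  }));

  // Invalidate the path so the CDN serves the fresh view only when the data actually changed
  await cloudfront.send(new CreateInvalidationCommand({
    DistributionId: process.env.DISTRIBUTION_ID,
    InvalidationBatch: {
      CallerReference: `${path}-${Date.now()}`, // must be unique per invalidation request
      Paths: { Quantity: 1, Items: [path] },
    },
  }));
};
```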